00:00:00.001 Started by upstream project "autotest-per-patch" build number 122813 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.025 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.026 The recommended git tool is: git 00:00:00.027 using credential 00000000-0000-0000-0000-000000000002 00:00:00.028 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.043 Fetching changes from the remote Git repository 00:00:00.044 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.081 Using shallow fetch with depth 1 00:00:00.081 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.081 > git --version # timeout=10 00:00:00.128 > git --version # 'git version 2.39.2' 00:00:00.128 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.134 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.134 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.980 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.995 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.006 Checking out Revision 10da8f6d99838e411e4e94523ded0bfebf3e7100 (FETCH_HEAD) 00:00:03.006 > git config core.sparsecheckout # timeout=10 00:00:03.016 > git read-tree -mu HEAD # timeout=10 00:00:03.031 > git checkout -f 10da8f6d99838e411e4e94523ded0bfebf3e7100 # timeout=5 00:00:03.048 Commit message: "scripts/create_git_mirror: Update path to xnvme submodule" 00:00:03.048 > git rev-list --no-walk 10da8f6d99838e411e4e94523ded0bfebf3e7100 # timeout=10 00:00:03.123 [Pipeline] Start of Pipeline 00:00:03.135 [Pipeline] library 00:00:03.137 Loading library shm_lib@master 00:00:03.137 Library shm_lib@master is cached. Copying from home. 00:00:03.157 [Pipeline] node 00:00:03.165 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.167 [Pipeline] { 00:00:03.179 [Pipeline] catchError 00:00:03.181 [Pipeline] { 00:00:03.196 [Pipeline] wrap 00:00:03.205 [Pipeline] { 00:00:03.212 [Pipeline] stage 00:00:03.235 [Pipeline] { (Prologue) 00:00:03.443 [Pipeline] sh 00:00:03.728 + logger -p user.info -t JENKINS-CI 00:00:03.746 [Pipeline] echo 00:00:03.747 Node: WFP50 00:00:03.755 [Pipeline] sh 00:00:04.050 [Pipeline] setCustomBuildProperty 00:00:04.061 [Pipeline] echo 00:00:04.063 Cleanup processes 00:00:04.068 [Pipeline] sh 00:00:04.352 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.352 228933 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.364 [Pipeline] sh 00:00:04.645 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.645 ++ grep -v 'sudo pgrep' 00:00:04.645 ++ awk '{print $1}' 00:00:04.645 + sudo kill -9 00:00:04.645 + true 00:00:04.657 [Pipeline] cleanWs 00:00:04.665 [WS-CLEANUP] Deleting project workspace... 00:00:04.666 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.672 [WS-CLEANUP] done 00:00:04.674 [Pipeline] setCustomBuildProperty 00:00:04.685 [Pipeline] sh 00:00:04.964 + sudo git config --global --replace-all safe.directory '*' 00:00:05.046 [Pipeline] nodesByLabel 00:00:05.047 Found a total of 1 nodes with the 'sorcerer' label 00:00:05.056 [Pipeline] httpRequest 00:00:05.061 HttpMethod: GET 00:00:05.061 URL: http://10.211.164.101/packages/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:05.067 Sending request to url: http://10.211.164.101/packages/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:05.070 Response Code: HTTP/1.1 200 OK 00:00:05.071 Success: Status code 200 is in the accepted range: 200,404 00:00:05.071 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:05.564 [Pipeline] sh 00:00:05.845 + tar --no-same-owner -xf jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:05.862 [Pipeline] httpRequest 00:00:05.866 HttpMethod: GET 00:00:05.866 URL: http://10.211.164.101/packages/spdk_52939f252f2e182ba62a91f015fc30b8e463d7b0.tar.gz 00:00:05.867 Sending request to url: http://10.211.164.101/packages/spdk_52939f252f2e182ba62a91f015fc30b8e463d7b0.tar.gz 00:00:05.878 Response Code: HTTP/1.1 200 OK 00:00:05.878 Success: Status code 200 is in the accepted range: 200,404 00:00:05.879 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_52939f252f2e182ba62a91f015fc30b8e463d7b0.tar.gz 00:00:36.009 [Pipeline] sh 00:00:36.293 + tar --no-same-owner -xf spdk_52939f252f2e182ba62a91f015fc30b8e463d7b0.tar.gz 00:00:39.597 [Pipeline] sh 00:00:39.880 + git -C spdk log --oneline -n5 00:00:39.880 52939f252 lib/blobfs: fix memory error for spdk_file_write 00:00:39.880 235c4c537 xnvme: change gitmodule-remote 00:00:39.880 bf8fa3b96 test/skipped_tests: update the list to current per-patch 00:00:39.880 e2d29d42b test/ftl: remove duplicated ftl_dirty_shutdown 00:00:39.880 7313180df test/ftl: replace FTL extended and nightly flags 00:00:39.892 [Pipeline] } 00:00:39.909 [Pipeline] // stage 00:00:39.918 [Pipeline] stage 00:00:39.920 [Pipeline] { (Prepare) 00:00:39.938 [Pipeline] writeFile 00:00:39.954 [Pipeline] sh 00:00:40.238 + logger -p user.info -t JENKINS-CI 00:00:40.252 [Pipeline] sh 00:00:40.537 + logger -p user.info -t JENKINS-CI 00:00:40.549 [Pipeline] sh 00:00:40.832 + cat autorun-spdk.conf 00:00:40.832 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:40.832 SPDK_TEST_BLOCKDEV=1 00:00:40.832 SPDK_TEST_ISAL=1 00:00:40.832 SPDK_TEST_CRYPTO=1 00:00:40.832 SPDK_TEST_REDUCE=1 00:00:40.832 SPDK_TEST_VBDEV_COMPRESS=1 00:00:40.832 SPDK_RUN_UBSAN=1 00:00:40.840 RUN_NIGHTLY=0 00:00:40.844 [Pipeline] readFile 00:00:40.863 [Pipeline] withEnv 00:00:40.864 [Pipeline] { 00:00:40.877 [Pipeline] sh 00:00:41.160 + set -ex 00:00:41.160 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:41.160 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:41.160 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:41.160 ++ SPDK_TEST_BLOCKDEV=1 00:00:41.160 ++ SPDK_TEST_ISAL=1 00:00:41.160 ++ SPDK_TEST_CRYPTO=1 00:00:41.160 ++ SPDK_TEST_REDUCE=1 00:00:41.160 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:41.160 ++ SPDK_RUN_UBSAN=1 00:00:41.160 ++ RUN_NIGHTLY=0 00:00:41.160 + case $SPDK_TEST_NVMF_NICS in 00:00:41.160 + DRIVERS= 00:00:41.160 + [[ -n '' ]] 00:00:41.160 + exit 0 00:00:41.169 [Pipeline] } 00:00:41.188 [Pipeline] // withEnv 00:00:41.193 [Pipeline] } 00:00:41.210 [Pipeline] // stage 00:00:41.220 [Pipeline] catchError 00:00:41.222 [Pipeline] { 
00:00:41.238 [Pipeline] timeout 00:00:41.239 Timeout set to expire in 40 min 00:00:41.241 [Pipeline] { 00:00:41.257 [Pipeline] stage 00:00:41.258 [Pipeline] { (Tests) 00:00:41.272 [Pipeline] sh 00:00:41.553 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:41.553 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:41.553 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:41.553 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:41.553 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:41.553 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:41.553 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:41.553 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:41.553 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:41.553 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:41.553 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:41.553 + source /etc/os-release 00:00:41.553 ++ NAME='Fedora Linux' 00:00:41.553 ++ VERSION='38 (Cloud Edition)' 00:00:41.553 ++ ID=fedora 00:00:41.553 ++ VERSION_ID=38 00:00:41.553 ++ VERSION_CODENAME= 00:00:41.553 ++ PLATFORM_ID=platform:f38 00:00:41.553 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:41.553 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:41.553 ++ LOGO=fedora-logo-icon 00:00:41.553 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:41.553 ++ HOME_URL=https://fedoraproject.org/ 00:00:41.553 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:41.553 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:41.553 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:41.553 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:41.553 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:41.553 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:41.553 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:41.553 ++ SUPPORT_END=2024-05-14 00:00:41.553 ++ VARIANT='Cloud Edition' 00:00:41.553 ++ VARIANT_ID=cloud 00:00:41.553 + uname -a 00:00:41.553 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:41.553 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:44.847 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:00:44.847 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:00:44.847 Hugepages 00:00:44.847 node hugesize free / total 00:00:44.847 node0 1048576kB 0 / 0 00:00:44.847 node0 2048kB 0 / 0 00:00:44.847 node1 1048576kB 0 / 0 00:00:44.847 node1 2048kB 0 / 0 00:00:44.847 00:00:44.847 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:44.847 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:44.847 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:44.847 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:00:44.847 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:44.847 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:45.107 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:45.107 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:45.107 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:45.107 
I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:45.107 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:45.107 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:45.107 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:00:45.107 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:00:45.107 + rm -f /tmp/spdk-ld-path 00:00:45.107 + source autorun-spdk.conf 00:00:45.107 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:45.107 ++ SPDK_TEST_BLOCKDEV=1 00:00:45.107 ++ SPDK_TEST_ISAL=1 00:00:45.107 ++ SPDK_TEST_CRYPTO=1 00:00:45.107 ++ SPDK_TEST_REDUCE=1 00:00:45.107 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:45.107 ++ SPDK_RUN_UBSAN=1 00:00:45.107 ++ RUN_NIGHTLY=0 00:00:45.107 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:45.107 + [[ -n '' ]] 00:00:45.107 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:45.107 + for M in /var/spdk/build-*-manifest.txt 00:00:45.107 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:45.107 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:45.107 + for M in /var/spdk/build-*-manifest.txt 00:00:45.107 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:45.107 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:45.107 ++ uname 00:00:45.107 + [[ Linux == \L\i\n\u\x ]] 00:00:45.107 + sudo dmesg -T 00:00:45.107 + sudo dmesg --clear 00:00:45.107 + dmesg_pid=230144 00:00:45.107 + [[ Fedora Linux == FreeBSD ]] 00:00:45.107 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:45.107 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:45.107 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:45.107 + [[ -x /usr/src/fio-static/fio ]] 00:00:45.107 + export FIO_BIN=/usr/src/fio-static/fio 00:00:45.107 + FIO_BIN=/usr/src/fio-static/fio 00:00:45.107 + sudo dmesg -Tw 00:00:45.107 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:45.107 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:45.107 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:45.107 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:45.107 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:45.107 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:45.107 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:45.107 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:45.107 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:45.107 Test configuration: 00:00:45.107 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:45.107 SPDK_TEST_BLOCKDEV=1 00:00:45.107 SPDK_TEST_ISAL=1 00:00:45.107 SPDK_TEST_CRYPTO=1 00:00:45.107 SPDK_TEST_REDUCE=1 00:00:45.107 SPDK_TEST_VBDEV_COMPRESS=1 00:00:45.107 SPDK_RUN_UBSAN=1 00:00:45.366 RUN_NIGHTLY=0 23:42:45 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:45.366 23:42:45 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:45.366 23:42:45 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:45.366 23:42:45 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:45.366 23:42:45 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:45.366 23:42:45 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:45.366 23:42:45 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:45.366 23:42:45 -- paths/export.sh@5 -- $ export PATH 00:00:45.366 23:42:45 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:45.366 23:42:45 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:45.366 23:42:45 -- common/autobuild_common.sh@437 -- $ date +%s 00:00:45.366 23:42:45 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715722965.XXXXXX 00:00:45.366 23:42:45 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715722965.WfGLyR 00:00:45.366 23:42:45 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:00:45.366 23:42:45 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 
00:00:45.366 23:42:45 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:45.366 23:42:45 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:45.366 23:42:45 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:45.366 23:42:45 -- common/autobuild_common.sh@453 -- $ get_config_params 00:00:45.366 23:42:45 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:00:45.366 23:42:45 -- common/autotest_common.sh@10 -- $ set +x 00:00:45.366 23:42:45 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:45.366 23:42:45 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:00:45.366 23:42:45 -- pm/common@17 -- $ local monitor 00:00:45.366 23:42:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:45.366 23:42:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:45.366 23:42:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:45.366 23:42:45 -- pm/common@21 -- $ date +%s 00:00:45.367 23:42:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:45.367 23:42:45 -- pm/common@21 -- $ date +%s 00:00:45.367 23:42:45 -- pm/common@25 -- $ sleep 1 00:00:45.367 23:42:45 -- pm/common@21 -- $ date +%s 00:00:45.367 23:42:45 -- pm/common@21 -- $ date +%s 00:00:45.367 23:42:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715722965 00:00:45.367 23:42:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715722965 00:00:45.367 23:42:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715722965 00:00:45.367 23:42:45 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715722965 00:00:45.367 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715722965_collect-vmstat.pm.log 00:00:45.367 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715722965_collect-cpu-load.pm.log 00:00:45.367 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715722965_collect-cpu-temp.pm.log 00:00:45.367 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715722965_collect-bmc-pm.bmc.pm.log 00:00:46.305 23:42:46 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:00:46.305 23:42:46 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:00:46.305 23:42:46 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:46.305 23:42:46 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:46.305 23:42:46 -- spdk/autobuild.sh@16 -- $ date -u 00:00:46.305 Tue May 14 09:42:46 PM UTC 2024 00:00:46.305 23:42:46 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:46.305 v24.05-pre-617-g52939f252 00:00:46.305 23:42:46 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:46.305 23:42:46 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:46.305 23:42:46 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:46.305 23:42:46 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:00:46.305 23:42:46 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:00:46.305 23:42:46 -- common/autotest_common.sh@10 -- $ set +x 00:00:46.564 ************************************ 00:00:46.564 START TEST ubsan 00:00:46.564 ************************************ 00:00:46.564 23:42:46 ubsan -- common/autotest_common.sh@1121 -- $ echo 'using ubsan' 00:00:46.564 using ubsan 00:00:46.564 00:00:46.564 real 0m0.000s 00:00:46.564 user 0m0.000s 00:00:46.564 sys 0m0.000s 00:00:46.564 23:42:46 ubsan -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:00:46.564 23:42:46 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:46.564 ************************************ 00:00:46.564 END TEST ubsan 00:00:46.564 ************************************ 00:00:46.564 23:42:46 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:46.564 23:42:46 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:46.564 23:42:46 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:46.564 23:42:46 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:46.564 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:46.564 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:00:47.133 Using 'verbs' RDMA provider 00:01:03.460 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:18.345 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:18.345 Creating mk/config.mk...done. 00:01:18.345 Creating mk/cc.flags.mk...done. 00:01:18.345 Type 'make' to build. 00:01:18.345 23:43:17 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:18.345 23:43:17 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:01:18.345 23:43:17 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:01:18.345 23:43:17 -- common/autotest_common.sh@10 -- $ set +x 00:01:18.345 ************************************ 00:01:18.345 START TEST make 00:01:18.345 ************************************ 00:01:18.345 23:43:17 make -- common/autotest_common.sh@1121 -- $ make -j72 00:01:18.345 make[1]: Nothing to be done for 'all'. 
00:01:57.084 The Meson build system 00:01:57.084 Version: 1.3.1 00:01:57.084 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:01:57.084 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:01:57.084 Build type: native build 00:01:57.084 Program cat found: YES (/usr/bin/cat) 00:01:57.084 Project name: DPDK 00:01:57.084 Project version: 23.11.0 00:01:57.084 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:57.084 C linker for the host machine: cc ld.bfd 2.39-16 00:01:57.084 Host machine cpu family: x86_64 00:01:57.084 Host machine cpu: x86_64 00:01:57.084 Message: ## Building in Developer Mode ## 00:01:57.084 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:57.084 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:57.084 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:57.084 Program python3 found: YES (/usr/bin/python3) 00:01:57.084 Program cat found: YES (/usr/bin/cat) 00:01:57.084 Compiler for C supports arguments -march=native: YES 00:01:57.084 Checking for size of "void *" : 8 00:01:57.084 Checking for size of "void *" : 8 (cached) 00:01:57.084 Library m found: YES 00:01:57.084 Library numa found: YES 00:01:57.084 Has header "numaif.h" : YES 00:01:57.084 Library fdt found: NO 00:01:57.084 Library execinfo found: NO 00:01:57.084 Has header "execinfo.h" : YES 00:01:57.084 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:57.084 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:57.084 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:57.084 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:57.084 Run-time dependency openssl found: YES 3.0.9 00:01:57.084 Run-time dependency libpcap found: YES 1.10.4 00:01:57.084 Has header "pcap.h" with dependency libpcap: YES 00:01:57.084 Compiler for C supports arguments -Wcast-qual: YES 00:01:57.084 Compiler for C supports arguments -Wdeprecated: YES 00:01:57.084 Compiler for C supports arguments -Wformat: YES 00:01:57.084 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:57.084 Compiler for C supports arguments -Wformat-security: NO 00:01:57.084 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:57.085 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:57.085 Compiler for C supports arguments -Wnested-externs: YES 00:01:57.085 Compiler for C supports arguments -Wold-style-definition: YES 00:01:57.085 Compiler for C supports arguments -Wpointer-arith: YES 00:01:57.085 Compiler for C supports arguments -Wsign-compare: YES 00:01:57.085 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:57.085 Compiler for C supports arguments -Wundef: YES 00:01:57.085 Compiler for C supports arguments -Wwrite-strings: YES 00:01:57.085 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:57.085 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:57.085 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:57.085 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:57.085 Program objdump found: YES (/usr/bin/objdump) 00:01:57.085 Compiler for C supports arguments -mavx512f: YES 00:01:57.085 Checking if "AVX512 checking" compiles: YES 00:01:57.085 Fetching value of define "__SSE4_2__" : 1 00:01:57.085 Fetching value of define 
"__AES__" : 1 00:01:57.085 Fetching value of define "__AVX__" : 1 00:01:57.085 Fetching value of define "__AVX2__" : 1 00:01:57.085 Fetching value of define "__AVX512BW__" : 1 00:01:57.085 Fetching value of define "__AVX512CD__" : 1 00:01:57.085 Fetching value of define "__AVX512DQ__" : 1 00:01:57.085 Fetching value of define "__AVX512F__" : 1 00:01:57.085 Fetching value of define "__AVX512VL__" : 1 00:01:57.085 Fetching value of define "__PCLMUL__" : 1 00:01:57.085 Fetching value of define "__RDRND__" : 1 00:01:57.085 Fetching value of define "__RDSEED__" : 1 00:01:57.085 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:57.085 Fetching value of define "__znver1__" : (undefined) 00:01:57.085 Fetching value of define "__znver2__" : (undefined) 00:01:57.085 Fetching value of define "__znver3__" : (undefined) 00:01:57.085 Fetching value of define "__znver4__" : (undefined) 00:01:57.085 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:57.085 Message: lib/log: Defining dependency "log" 00:01:57.085 Message: lib/kvargs: Defining dependency "kvargs" 00:01:57.085 Message: lib/telemetry: Defining dependency "telemetry" 00:01:57.085 Checking for function "getentropy" : NO 00:01:57.085 Message: lib/eal: Defining dependency "eal" 00:01:57.085 Message: lib/ring: Defining dependency "ring" 00:01:57.085 Message: lib/rcu: Defining dependency "rcu" 00:01:57.085 Message: lib/mempool: Defining dependency "mempool" 00:01:57.085 Message: lib/mbuf: Defining dependency "mbuf" 00:01:57.085 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:57.085 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:57.085 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:57.085 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:57.085 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:57.085 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:57.085 Compiler for C supports arguments -mpclmul: YES 00:01:57.085 Compiler for C supports arguments -maes: YES 00:01:57.085 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:57.085 Compiler for C supports arguments -mavx512bw: YES 00:01:57.085 Compiler for C supports arguments -mavx512dq: YES 00:01:57.085 Compiler for C supports arguments -mavx512vl: YES 00:01:57.085 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:57.085 Compiler for C supports arguments -mavx2: YES 00:01:57.085 Compiler for C supports arguments -mavx: YES 00:01:57.085 Message: lib/net: Defining dependency "net" 00:01:57.085 Message: lib/meter: Defining dependency "meter" 00:01:57.085 Message: lib/ethdev: Defining dependency "ethdev" 00:01:57.085 Message: lib/pci: Defining dependency "pci" 00:01:57.085 Message: lib/cmdline: Defining dependency "cmdline" 00:01:57.085 Message: lib/hash: Defining dependency "hash" 00:01:57.085 Message: lib/timer: Defining dependency "timer" 00:01:57.085 Message: lib/compressdev: Defining dependency "compressdev" 00:01:57.085 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:57.085 Message: lib/dmadev: Defining dependency "dmadev" 00:01:57.085 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:57.085 Message: lib/power: Defining dependency "power" 00:01:57.085 Message: lib/reorder: Defining dependency "reorder" 00:01:57.085 Message: lib/security: Defining dependency "security" 00:01:57.085 Has header "linux/userfaultfd.h" : YES 00:01:57.085 Has header "linux/vduse.h" : YES 00:01:57.085 Message: lib/vhost: Defining dependency "vhost" 00:01:57.085 Compiler for C 
supports arguments -Wno-format-truncation: YES (cached) 00:01:57.085 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:01:57.085 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:57.085 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:57.085 Compiler for C supports arguments -std=c11: YES 00:01:57.085 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:01:57.085 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:01:57.085 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:01:57.085 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:01:57.085 Run-time dependency libmlx5 found: YES 1.24.44.0 00:01:57.085 Run-time dependency libibverbs found: YES 1.14.44.0 00:01:57.085 Library mtcr_ul found: NO 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:01:57.085 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, 
libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:00.433 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_25000baseCR_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_50000baseCR2_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_100000baseKR4_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:00.434 
Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:00.434 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:00.434 Configuring mlx5_autoconf.h using configuration 00:02:00.434 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:00.434 Run-time dependency libcrypto found: YES 3.0.9 00:02:00.434 Library IPSec_MB found: YES 
00:02:00.434 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:00.434 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:00.434 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:00.434 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:00.434 Library IPSec_MB found: YES 00:02:00.434 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:00.434 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:00.434 Compiler for C supports arguments -std=c11: YES (cached) 00:02:00.434 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:00.434 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:00.434 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:00.434 Library libisal found: NO 00:02:00.434 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:00.434 Compiler for C supports arguments -std=c11: YES (cached) 00:02:00.434 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:00.434 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:00.434 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:00.434 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:00.434 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:00.434 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:00.434 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:00.434 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:00.434 Program doxygen found: YES (/usr/bin/doxygen) 00:02:00.434 Configuring doxy-api-html.conf using configuration 00:02:00.434 Configuring doxy-api-man.conf using configuration 00:02:00.434 Program mandb found: YES (/usr/bin/mandb) 00:02:00.434 Program sphinx-build found: NO 00:02:00.434 Configuring rte_build_config.h using configuration 00:02:00.434 Message: 00:02:00.434 ================= 00:02:00.434 Applications Enabled 00:02:00.434 ================= 00:02:00.434 00:02:00.434 apps: 00:02:00.434 00:02:00.434 00:02:00.434 Message: 00:02:00.434 ================= 00:02:00.434 Libraries Enabled 00:02:00.434 ================= 00:02:00.434 00:02:00.434 libs: 00:02:00.434 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:00.434 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:00.434 cryptodev, dmadev, power, reorder, security, vhost, 00:02:00.434 00:02:00.434 Message: 00:02:00.434 =============== 00:02:00.434 Drivers Enabled 00:02:00.434 =============== 00:02:00.434 00:02:00.434 common: 00:02:00.434 mlx5, qat, 00:02:00.434 bus: 00:02:00.434 auxiliary, pci, vdev, 00:02:00.434 mempool: 00:02:00.434 ring, 00:02:00.434 dma: 00:02:00.434 00:02:00.434 net: 00:02:00.434 00:02:00.434 crypto: 00:02:00.434 ipsec_mb, mlx5, 00:02:00.434 compress: 00:02:00.434 isal, mlx5, 00:02:00.434 vdpa: 00:02:00.434 00:02:00.434 00:02:00.434 Message: 00:02:00.434 ================= 00:02:00.434 Content Skipped 00:02:00.434 ================= 00:02:00.434 
00:02:00.434 apps: 00:02:00.434 dumpcap: explicitly disabled via build config 00:02:00.434 graph: explicitly disabled via build config 00:02:00.434 pdump: explicitly disabled via build config 00:02:00.434 proc-info: explicitly disabled via build config 00:02:00.434 test-acl: explicitly disabled via build config 00:02:00.434 test-bbdev: explicitly disabled via build config 00:02:00.434 test-cmdline: explicitly disabled via build config 00:02:00.434 test-compress-perf: explicitly disabled via build config 00:02:00.434 test-crypto-perf: explicitly disabled via build config 00:02:00.434 test-dma-perf: explicitly disabled via build config 00:02:00.434 test-eventdev: explicitly disabled via build config 00:02:00.434 test-fib: explicitly disabled via build config 00:02:00.434 test-flow-perf: explicitly disabled via build config 00:02:00.434 test-gpudev: explicitly disabled via build config 00:02:00.434 test-mldev: explicitly disabled via build config 00:02:00.434 test-pipeline: explicitly disabled via build config 00:02:00.434 test-pmd: explicitly disabled via build config 00:02:00.434 test-regex: explicitly disabled via build config 00:02:00.434 test-sad: explicitly disabled via build config 00:02:00.434 test-security-perf: explicitly disabled via build config 00:02:00.434 00:02:00.434 libs: 00:02:00.434 metrics: explicitly disabled via build config 00:02:00.435 acl: explicitly disabled via build config 00:02:00.435 bbdev: explicitly disabled via build config 00:02:00.435 bitratestats: explicitly disabled via build config 00:02:00.435 bpf: explicitly disabled via build config 00:02:00.435 cfgfile: explicitly disabled via build config 00:02:00.435 distributor: explicitly disabled via build config 00:02:00.435 efd: explicitly disabled via build config 00:02:00.435 eventdev: explicitly disabled via build config 00:02:00.435 dispatcher: explicitly disabled via build config 00:02:00.435 gpudev: explicitly disabled via build config 00:02:00.435 gro: explicitly disabled via build config 00:02:00.435 gso: explicitly disabled via build config 00:02:00.435 ip_frag: explicitly disabled via build config 00:02:00.435 jobstats: explicitly disabled via build config 00:02:00.435 latencystats: explicitly disabled via build config 00:02:00.435 lpm: explicitly disabled via build config 00:02:00.435 member: explicitly disabled via build config 00:02:00.435 pcapng: explicitly disabled via build config 00:02:00.435 rawdev: explicitly disabled via build config 00:02:00.435 regexdev: explicitly disabled via build config 00:02:00.435 mldev: explicitly disabled via build config 00:02:00.435 rib: explicitly disabled via build config 00:02:00.435 sched: explicitly disabled via build config 00:02:00.435 stack: explicitly disabled via build config 00:02:00.435 ipsec: explicitly disabled via build config 00:02:00.435 pdcp: explicitly disabled via build config 00:02:00.435 fib: explicitly disabled via build config 00:02:00.435 port: explicitly disabled via build config 00:02:00.435 pdump: explicitly disabled via build config 00:02:00.435 table: explicitly disabled via build config 00:02:00.435 pipeline: explicitly disabled via build config 00:02:00.435 graph: explicitly disabled via build config 00:02:00.435 node: explicitly disabled via build config 00:02:00.435 00:02:00.435 drivers: 00:02:00.435 common/cpt: not in enabled drivers build config 00:02:00.435 common/dpaax: not in enabled drivers build config 00:02:00.435 common/iavf: not in enabled drivers build config 00:02:00.435 common/idpf: not in enabled drivers build 
config 00:02:00.435 common/mvep: not in enabled drivers build config 00:02:00.435 common/octeontx: not in enabled drivers build config 00:02:00.435 bus/cdx: not in enabled drivers build config 00:02:00.435 bus/dpaa: not in enabled drivers build config 00:02:00.435 bus/fslmc: not in enabled drivers build config 00:02:00.435 bus/ifpga: not in enabled drivers build config 00:02:00.435 bus/platform: not in enabled drivers build config 00:02:00.435 bus/vmbus: not in enabled drivers build config 00:02:00.435 common/cnxk: not in enabled drivers build config 00:02:00.435 common/nfp: not in enabled drivers build config 00:02:00.435 common/sfc_efx: not in enabled drivers build config 00:02:00.435 mempool/bucket: not in enabled drivers build config 00:02:00.435 mempool/cnxk: not in enabled drivers build config 00:02:00.435 mempool/dpaa: not in enabled drivers build config 00:02:00.435 mempool/dpaa2: not in enabled drivers build config 00:02:00.435 mempool/octeontx: not in enabled drivers build config 00:02:00.435 mempool/stack: not in enabled drivers build config 00:02:00.435 dma/cnxk: not in enabled drivers build config 00:02:00.435 dma/dpaa: not in enabled drivers build config 00:02:00.435 dma/dpaa2: not in enabled drivers build config 00:02:00.435 dma/hisilicon: not in enabled drivers build config 00:02:00.435 dma/idxd: not in enabled drivers build config 00:02:00.435 dma/ioat: not in enabled drivers build config 00:02:00.435 dma/skeleton: not in enabled drivers build config 00:02:00.435 net/af_packet: not in enabled drivers build config 00:02:00.435 net/af_xdp: not in enabled drivers build config 00:02:00.435 net/ark: not in enabled drivers build config 00:02:00.435 net/atlantic: not in enabled drivers build config 00:02:00.435 net/avp: not in enabled drivers build config 00:02:00.435 net/axgbe: not in enabled drivers build config 00:02:00.435 net/bnx2x: not in enabled drivers build config 00:02:00.435 net/bnxt: not in enabled drivers build config 00:02:00.435 net/bonding: not in enabled drivers build config 00:02:00.435 net/cnxk: not in enabled drivers build config 00:02:00.435 net/cpfl: not in enabled drivers build config 00:02:00.435 net/cxgbe: not in enabled drivers build config 00:02:00.435 net/dpaa: not in enabled drivers build config 00:02:00.435 net/dpaa2: not in enabled drivers build config 00:02:00.435 net/e1000: not in enabled drivers build config 00:02:00.435 net/ena: not in enabled drivers build config 00:02:00.435 net/enetc: not in enabled drivers build config 00:02:00.435 net/enetfec: not in enabled drivers build config 00:02:00.435 net/enic: not in enabled drivers build config 00:02:00.435 net/failsafe: not in enabled drivers build config 00:02:00.435 net/fm10k: not in enabled drivers build config 00:02:00.435 net/gve: not in enabled drivers build config 00:02:00.435 net/hinic: not in enabled drivers build config 00:02:00.435 net/hns3: not in enabled drivers build config 00:02:00.435 net/i40e: not in enabled drivers build config 00:02:00.435 net/iavf: not in enabled drivers build config 00:02:00.435 net/ice: not in enabled drivers build config 00:02:00.435 net/idpf: not in enabled drivers build config 00:02:00.435 net/igc: not in enabled drivers build config 00:02:00.435 net/ionic: not in enabled drivers build config 00:02:00.435 net/ipn3ke: not in enabled drivers build config 00:02:00.435 net/ixgbe: not in enabled drivers build config 00:02:00.435 net/mana: not in enabled drivers build config 00:02:00.435 net/memif: not in enabled drivers build config 00:02:00.435 net/mlx4: not 
in enabled drivers build config 00:02:00.435 net/mlx5: not in enabled drivers build config 00:02:00.435 net/mvneta: not in enabled drivers build config 00:02:00.435 net/mvpp2: not in enabled drivers build config 00:02:00.435 net/netvsc: not in enabled drivers build config 00:02:00.435 net/nfb: not in enabled drivers build config 00:02:00.435 net/nfp: not in enabled drivers build config 00:02:00.435 net/ngbe: not in enabled drivers build config 00:02:00.435 net/null: not in enabled drivers build config 00:02:00.435 net/octeontx: not in enabled drivers build config 00:02:00.435 net/octeon_ep: not in enabled drivers build config 00:02:00.435 net/pcap: not in enabled drivers build config 00:02:00.435 net/pfe: not in enabled drivers build config 00:02:00.435 net/qede: not in enabled drivers build config 00:02:00.435 net/ring: not in enabled drivers build config 00:02:00.435 net/sfc: not in enabled drivers build config 00:02:00.435 net/softnic: not in enabled drivers build config 00:02:00.435 net/tap: not in enabled drivers build config 00:02:00.435 net/thunderx: not in enabled drivers build config 00:02:00.435 net/txgbe: not in enabled drivers build config 00:02:00.435 net/vdev_netvsc: not in enabled drivers build config 00:02:00.435 net/vhost: not in enabled drivers build config 00:02:00.435 net/virtio: not in enabled drivers build config 00:02:00.435 net/vmxnet3: not in enabled drivers build config 00:02:00.435 raw/*: missing internal dependency, "rawdev" 00:02:00.435 crypto/armv8: not in enabled drivers build config 00:02:00.435 crypto/bcmfs: not in enabled drivers build config 00:02:00.435 crypto/caam_jr: not in enabled drivers build config 00:02:00.435 crypto/ccp: not in enabled drivers build config 00:02:00.435 crypto/cnxk: not in enabled drivers build config 00:02:00.435 crypto/dpaa_sec: not in enabled drivers build config 00:02:00.435 crypto/dpaa2_sec: not in enabled drivers build config 00:02:00.435 crypto/mvsam: not in enabled drivers build config 00:02:00.435 crypto/nitrox: not in enabled drivers build config 00:02:00.435 crypto/null: not in enabled drivers build config 00:02:00.435 crypto/octeontx: not in enabled drivers build config 00:02:00.435 crypto/openssl: not in enabled drivers build config 00:02:00.435 crypto/scheduler: not in enabled drivers build config 00:02:00.435 crypto/uadk: not in enabled drivers build config 00:02:00.435 crypto/virtio: not in enabled drivers build config 00:02:00.435 compress/octeontx: not in enabled drivers build config 00:02:00.435 compress/zlib: not in enabled drivers build config 00:02:00.435 regex/*: missing internal dependency, "regexdev" 00:02:00.435 ml/*: missing internal dependency, "mldev" 00:02:00.435 vdpa/ifc: not in enabled drivers build config 00:02:00.435 vdpa/mlx5: not in enabled drivers build config 00:02:00.435 vdpa/nfp: not in enabled drivers build config 00:02:00.435 vdpa/sfc: not in enabled drivers build config 00:02:00.435 event/*: missing internal dependency, "eventdev" 00:02:00.435 baseband/*: missing internal dependency, "bbdev" 00:02:00.435 gpu/*: missing internal dependency, "gpudev" 00:02:00.435 00:02:00.435 00:02:01.004 Build targets in project: 115 00:02:01.004 00:02:01.004 DPDK 23.11.0 00:02:01.004 00:02:01.004 User defined options 00:02:01.004 buildtype : debug 00:02:01.004 default_library : shared 00:02:01.004 libdir : lib 00:02:01.004 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:01.004 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:01.005 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:01.005 cpu_instruction_set: native 00:02:01.005 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:02:01.005 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:02:01.005 enable_docs : false 00:02:01.005 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:01.005 enable_kmods : false 00:02:01.005 tests : false 00:02:01.005 00:02:01.005 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:01.580 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:01.580 [1/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:01.580 [2/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:01.580 [3/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:01.580 [4/370] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:01.580 [5/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:01.580 [6/370] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:01.580 [7/370] Linking static target lib/librte_kvargs.a 00:02:01.580 [8/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:01.580 [9/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:01.580 [10/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:01.580 [11/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:01.844 [12/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:01.844 [13/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:01.844 [14/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:01.844 [15/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:01.844 [16/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:01.844 [17/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:01.844 [18/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:01.844 [19/370] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:01.844 [20/370] Linking static target lib/librte_log.a 00:02:01.845 [21/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:01.845 [22/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:01.845 [23/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:01.845 [24/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:01.845 [25/370] 
Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:02.106 [26/370] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.106 [27/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:02.106 [28/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:02.106 [29/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:02.106 [30/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:02.106 [31/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:02.106 [32/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:02.106 [33/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:02.367 [34/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:02.367 [35/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:02.367 [36/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:02.367 [37/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:02.367 [38/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:02.367 [39/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:02.367 [40/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:02.367 [41/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:02.367 [42/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:02.367 [43/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:02.367 [44/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:02.367 [45/370] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:02.367 [46/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:02.367 [47/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:02.367 [48/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:02.367 [49/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:02.367 [50/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:02.367 [51/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:02.367 [52/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:02.367 [53/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:02.367 [54/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:02.367 [55/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:02.367 [56/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:02.367 [57/370] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:02.367 [58/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:02.367 [59/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:02.367 [60/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:02.367 [61/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:02.367 [62/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:02.367 [63/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:02.367 [64/370] Compiling C 
object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:02.367 [65/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:02.367 [66/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:02.367 [67/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:02.367 [68/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:02.367 [69/370] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:02.367 [70/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:02.367 [71/370] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:02.367 [72/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:02.367 [73/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:02.367 [74/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:02.367 [75/370] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:02.367 [76/370] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:02.367 [77/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:02.367 [78/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:02.367 [79/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:02.367 [80/370] Linking static target lib/librte_pci.a 00:02:02.367 [81/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:02.367 [82/370] Linking static target lib/librte_telemetry.a 00:02:02.367 [83/370] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:02.367 [84/370] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:02.367 [85/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:02.367 [86/370] Linking static target lib/librte_ring.a 00:02:02.367 [87/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:02.367 [88/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:02.367 [89/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:02.367 [90/370] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:02.367 [91/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:02.367 [92/370] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:02.367 [93/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:02.367 [94/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:02.367 [95/370] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:02.367 [96/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:02.367 [97/370] Linking static target lib/librte_mempool.a 00:02:02.367 [98/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:02.367 [99/370] Linking static target lib/librte_meter.a 00:02:02.367 [100/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:02.367 [101/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:02.367 [102/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:02.367 [103/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:02.367 [104/370] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:02.625 [105/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:02.625 
[106/370] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:02.625 [107/370] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:02.626 [108/370] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:02.626 [109/370] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:02.626 [110/370] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:02.626 [111/370] Linking static target lib/librte_rcu.a 00:02:02.626 [112/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:02.626 [113/370] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:02.626 [114/370] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:02.626 [115/370] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:02.626 [116/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:02.626 [117/370] Linking static target lib/librte_net.a 00:02:02.626 [118/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:02.626 [119/370] Linking static target lib/librte_eal.a 00:02:02.626 [120/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:02.626 [121/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:02.626 [122/370] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.885 [123/370] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.885 [124/370] Linking target lib/librte_log.so.24.0 00:02:02.885 [125/370] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:02.885 [126/370] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.885 [127/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:02.885 [128/370] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.885 [129/370] Linking static target lib/librte_mbuf.a 00:02:02.885 [130/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:02.885 [131/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:02.885 [132/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:02.885 [133/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:02.885 [134/370] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:02.885 [135/370] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:02.885 [136/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:02.885 [137/370] Linking static target lib/librte_cmdline.a 00:02:02.885 [138/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:02.885 [139/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:02.885 [140/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:03.147 [141/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:03.147 [142/370] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:03.147 [143/370] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:03.147 [144/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:03.147 [145/370] Linking static target lib/librte_timer.a 00:02:03.147 [146/370] Compiling C object 
lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:03.147 [147/370] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:03.147 [148/370] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.147 [149/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:03.147 [150/370] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:03.147 [151/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:03.147 [152/370] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:03.147 [153/370] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.147 [154/370] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:03.147 [155/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:03.147 [156/370] Linking static target lib/librte_compressdev.a 00:02:03.147 [157/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:03.147 [158/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:03.147 [159/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:03.147 [160/370] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:03.147 [161/370] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:03.147 [162/370] Linking target lib/librte_kvargs.so.24.0 00:02:03.147 [163/370] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:03.147 [164/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:03.147 [165/370] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:03.147 [166/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:03.147 [167/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:03.147 [168/370] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:03.147 [169/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:03.147 [170/370] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.147 [171/370] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:03.147 [172/370] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:03.147 [173/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:03.147 [174/370] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:03.147 [175/370] Linking static target lib/librte_dmadev.a 00:02:03.147 [176/370] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:03.148 [177/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:03.148 [178/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:03.148 [179/370] Linking target lib/librte_telemetry.so.24.0 00:02:03.408 [180/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:03.408 [181/370] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:03.408 [182/370] Linking static target lib/librte_reorder.a 00:02:03.408 [183/370] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:03.408 [184/370] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:03.408 [185/370] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 
00:02:03.408 [186/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:03.408 [187/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:03.408 [188/370] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:03.408 [189/370] Linking static target lib/librte_power.a 00:02:03.408 [190/370] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:03.408 [191/370] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:03.408 [192/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:03.408 [193/370] Linking static target lib/librte_security.a 00:02:03.408 [194/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:03.408 [195/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:03.408 [196/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:03.408 [197/370] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:03.408 [198/370] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:03.408 [199/370] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:03.408 [200/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:03.408 [201/370] Compiling C object drivers/librte_bus_auxiliary.so.24.0.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:03.668 [202/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:03.668 [203/370] Linking static target drivers/librte_bus_auxiliary.a 00:02:03.668 [204/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:03.668 [205/370] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:03.668 [206/370] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.668 [207/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:03.668 [208/370] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:03.668 [209/370] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:03.668 [210/370] Linking static target drivers/librte_bus_vdev.a 00:02:03.668 [211/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:03.668 [212/370] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.668 [213/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:03.668 [214/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:03.668 [215/370] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:03.668 [216/370] Linking static target lib/librte_hash.a 00:02:03.668 [217/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:03.668 [218/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:03.668 [219/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:03.668 [220/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:03.668 [221/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:03.668 [222/370] Compiling C 
object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:03.668 [223/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:03.668 [224/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:03.668 [225/370] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:03.668 [226/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:03.668 [227/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:03.668 [228/370] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [229/370] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:03.926 [230/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:03.926 [231/370] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:03.926 [232/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:03.926 [233/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:03.926 [234/370] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [235/370] Linking static target drivers/librte_bus_pci.a 00:02:03.926 [236/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:03.926 [237/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:03.926 [238/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:03.926 [239/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:03.926 [240/370] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [241/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:03.926 [242/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:03.926 [243/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:03.926 [244/370] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [245/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:03.926 [246/370] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [247/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:03.926 [248/370] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:03.926 [249/370] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:03.926 [250/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:03.926 [251/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:03.926 [252/370] Linking static target lib/librte_cryptodev.a 00:02:03.926 [253/370] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [254/370] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.926 [255/370] Compiling C object 
drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:04.186 [256/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:04.186 [257/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:04.186 [258/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:04.186 [259/370] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:04.186 [260/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:04.186 [261/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:04.186 [262/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:04.186 [263/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:04.186 [264/370] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:04.186 [265/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:04.186 [266/370] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:04.186 [267/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:04.186 [268/370] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:04.186 [269/370] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:04.186 [270/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:04.186 [271/370] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:04.186 [272/370] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:04.186 [273/370] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:04.186 [274/370] Linking static target drivers/librte_mempool_ring.a 00:02:04.186 [275/370] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:04.186 [276/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:04.186 [277/370] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.186 [278/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:04.186 [279/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:04.186 [280/370] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:04.186 [281/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:04.186 [282/370] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.186 [283/370] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:04.445 [284/370] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:04.445 [285/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:04.445 [286/370] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:04.445 [287/370] Compiling C object drivers/librte_crypto_mlx5.so.24.0.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:04.445 [288/370] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:04.445 [289/370] Linking static 
target drivers/librte_crypto_mlx5.a 00:02:04.445 [290/370] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:04.445 [291/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:04.445 [292/370] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:04.445 [293/370] Compiling C object drivers/librte_compress_isal.so.24.0.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:04.445 [294/370] Linking static target drivers/librte_compress_isal.a 00:02:04.445 [295/370] Linking static target lib/librte_ethdev.a 00:02:04.445 [296/370] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:04.445 [297/370] Compiling C object drivers/librte_compress_mlx5.so.24.0.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:04.445 [298/370] Linking static target drivers/librte_compress_mlx5.a 00:02:04.445 [299/370] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.445 [300/370] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:04.445 [301/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:04.704 [302/370] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:04.704 [303/370] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:04.704 [304/370] Compiling C object drivers/librte_common_mlx5.so.24.0.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:04.704 [305/370] Linking static target drivers/librte_common_mlx5.a 00:02:04.704 [306/370] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:04.704 [307/370] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.0.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:04.704 [308/370] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.704 [309/370] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:05.271 [310/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:05.271 [311/370] Linking static target drivers/libtmp_rte_common_qat.a 00:02:05.530 [312/370] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:05.530 [313/370] Compiling C object drivers/librte_common_qat.so.24.0.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:05.530 [314/370] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:05.530 [315/370] Linking static target drivers/librte_common_qat.a 00:02:05.789 [316/370] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:05.789 [317/370] Linking static target lib/librte_vhost.a 00:02:06.359 [318/370] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.269 [319/370] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.807 [320/370] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.099 [321/370] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.004 [322/370] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.004 [323/370] Linking target lib/librte_eal.so.24.0 00:02:16.264 [324/370] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:16.264 [325/370] Linking target 
lib/librte_ring.so.24.0 00:02:16.264 [326/370] Linking target lib/librte_meter.so.24.0 00:02:16.264 [327/370] Linking target lib/librte_pci.so.24.0 00:02:16.264 [328/370] Linking target lib/librte_timer.so.24.0 00:02:16.264 [329/370] Linking target lib/librte_dmadev.so.24.0 00:02:16.264 [330/370] Linking target drivers/librte_bus_vdev.so.24.0 00:02:16.264 [331/370] Linking target drivers/librte_bus_auxiliary.so.24.0 00:02:16.264 [332/370] Generating symbol file drivers/librte_bus_auxiliary.so.24.0.p/librte_bus_auxiliary.so.24.0.symbols 00:02:16.264 [333/370] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:16.264 [334/370] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:16.264 [335/370] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:16.264 [336/370] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:16.264 [337/370] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:16.264 [338/370] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:16.523 [339/370] Linking target lib/librte_rcu.so.24.0 00:02:16.523 [340/370] Linking target lib/librte_mempool.so.24.0 00:02:16.523 [341/370] Linking target drivers/librte_bus_pci.so.24.0 00:02:16.523 [342/370] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:16.523 [343/370] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:16.523 [344/370] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:16.523 [345/370] Linking target drivers/librte_mempool_ring.so.24.0 00:02:16.523 [346/370] Linking target lib/librte_mbuf.so.24.0 00:02:16.782 [347/370] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:16.782 [348/370] Linking target lib/librte_net.so.24.0 00:02:16.782 [349/370] Linking target lib/librte_compressdev.so.24.0 00:02:16.782 [350/370] Linking target lib/librte_cryptodev.so.24.0 00:02:16.782 [351/370] Linking target lib/librte_reorder.so.24.0 00:02:17.040 [352/370] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:17.040 [353/370] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:17.040 [354/370] Generating symbol file lib/librte_compressdev.so.24.0.p/librte_compressdev.so.24.0.symbols 00:02:17.040 [355/370] Linking target lib/librte_hash.so.24.0 00:02:17.040 [356/370] Linking target lib/librte_security.so.24.0 00:02:17.040 [357/370] Linking target lib/librte_cmdline.so.24.0 00:02:17.040 [358/370] Linking target drivers/librte_compress_isal.so.24.0 00:02:17.040 [359/370] Linking target lib/librte_ethdev.so.24.0 00:02:17.299 [360/370] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:17.299 [361/370] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:17.299 [362/370] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:17.299 [363/370] Linking target drivers/librte_common_mlx5.so.24.0 00:02:17.299 [364/370] Linking target lib/librte_power.so.24.0 00:02:17.299 [365/370] Linking target lib/librte_vhost.so.24.0 00:02:17.559 [366/370] Generating symbol file drivers/librte_common_mlx5.so.24.0.p/librte_common_mlx5.so.24.0.symbols 00:02:17.559 [367/370] Linking target drivers/librte_crypto_mlx5.so.24.0 00:02:17.559 [368/370] Linking 
target drivers/librte_compress_mlx5.so.24.0 00:02:17.559 [369/370] Linking target drivers/librte_crypto_ipsec_mb.so.24.0 00:02:17.559 [370/370] Linking target drivers/librte_common_qat.so.24.0 00:02:17.559 INFO: autodetecting backend as ninja 00:02:17.559 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:18.963 CC lib/log/log.o 00:02:18.963 CC lib/log/log_deprecated.o 00:02:18.963 CC lib/log/log_flags.o 00:02:18.963 CC lib/ut_mock/mock.o 00:02:18.963 CC lib/ut/ut.o 00:02:19.222 LIB libspdk_ut_mock.a 00:02:19.222 LIB libspdk_log.a 00:02:19.222 SO libspdk_ut_mock.so.6.0 00:02:19.222 LIB libspdk_ut.a 00:02:19.222 SO libspdk_log.so.7.0 00:02:19.222 SO libspdk_ut.so.2.0 00:02:19.222 SYMLINK libspdk_ut_mock.so 00:02:19.222 SYMLINK libspdk_ut.so 00:02:19.222 SYMLINK libspdk_log.so 00:02:19.790 CC lib/util/base64.o 00:02:19.790 CC lib/util/bit_array.o 00:02:19.790 CC lib/util/cpuset.o 00:02:19.790 CC lib/util/crc16.o 00:02:19.790 CC lib/util/crc32.o 00:02:19.790 CC lib/util/crc32c.o 00:02:19.790 CC lib/dma/dma.o 00:02:19.790 CC lib/util/crc32_ieee.o 00:02:19.790 CC lib/util/crc64.o 00:02:19.790 CC lib/util/dif.o 00:02:19.790 CXX lib/trace_parser/trace.o 00:02:19.790 CC lib/util/fd.o 00:02:19.790 CC lib/util/file.o 00:02:19.790 CC lib/util/iov.o 00:02:19.790 CC lib/util/hexlify.o 00:02:19.790 CC lib/util/math.o 00:02:19.790 CC lib/util/strerror_tls.o 00:02:19.790 CC lib/util/pipe.o 00:02:19.790 CC lib/util/string.o 00:02:19.790 CC lib/util/uuid.o 00:02:19.790 CC lib/util/xor.o 00:02:19.790 CC lib/util/fd_group.o 00:02:19.790 CC lib/util/zipf.o 00:02:19.790 CC lib/ioat/ioat.o 00:02:19.790 CC lib/vfio_user/host/vfio_user_pci.o 00:02:19.790 CC lib/vfio_user/host/vfio_user.o 00:02:19.790 LIB libspdk_dma.a 00:02:19.790 SO libspdk_dma.so.4.0 00:02:20.048 SYMLINK libspdk_dma.so 00:02:20.048 LIB libspdk_ioat.a 00:02:20.048 SO libspdk_ioat.so.7.0 00:02:20.048 SYMLINK libspdk_ioat.so 00:02:20.048 LIB libspdk_vfio_user.a 00:02:20.048 SO libspdk_vfio_user.so.5.0 00:02:20.307 LIB libspdk_util.a 00:02:20.307 SYMLINK libspdk_vfio_user.so 00:02:20.307 SO libspdk_util.so.9.0 00:02:20.566 SYMLINK libspdk_util.so 00:02:20.566 LIB libspdk_trace_parser.a 00:02:20.566 SO libspdk_trace_parser.so.5.0 00:02:20.824 SYMLINK libspdk_trace_parser.so 00:02:20.824 CC lib/json/json_util.o 00:02:20.824 CC lib/json/json_parse.o 00:02:20.824 CC lib/rdma/common.o 00:02:20.824 CC lib/reduce/reduce.o 00:02:20.824 CC lib/rdma/rdma_verbs.o 00:02:20.824 CC lib/json/json_write.o 00:02:20.824 CC lib/vmd/vmd.o 00:02:20.824 CC lib/vmd/led.o 00:02:20.824 CC lib/conf/conf.o 00:02:20.824 CC lib/idxd/idxd.o 00:02:20.824 CC lib/idxd/idxd_user.o 00:02:20.824 CC lib/env_dpdk/env.o 00:02:20.824 CC lib/env_dpdk/memory.o 00:02:20.824 CC lib/env_dpdk/pci.o 00:02:20.824 CC lib/env_dpdk/init.o 00:02:20.824 CC lib/env_dpdk/threads.o 00:02:20.824 CC lib/env_dpdk/pci_ioat.o 00:02:20.824 CC lib/env_dpdk/pci_virtio.o 00:02:20.824 CC lib/env_dpdk/pci_vmd.o 00:02:20.824 CC lib/env_dpdk/pci_idxd.o 00:02:20.824 CC lib/env_dpdk/pci_event.o 00:02:20.824 CC lib/env_dpdk/sigbus_handler.o 00:02:20.824 CC lib/env_dpdk/pci_dpdk.o 00:02:20.824 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:20.824 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:21.081 LIB libspdk_conf.a 00:02:21.081 SO libspdk_conf.so.6.0 00:02:21.349 LIB libspdk_rdma.a 00:02:21.349 LIB libspdk_json.a 00:02:21.349 SO libspdk_rdma.so.6.0 00:02:21.349 SYMLINK libspdk_conf.so 00:02:21.349 SO libspdk_json.so.6.0 00:02:21.349 SYMLINK 
libspdk_rdma.so 00:02:21.349 SYMLINK libspdk_json.so 00:02:21.349 LIB libspdk_idxd.a 00:02:21.611 SO libspdk_idxd.so.12.0 00:02:21.611 LIB libspdk_reduce.a 00:02:21.611 LIB libspdk_vmd.a 00:02:21.611 SO libspdk_reduce.so.6.0 00:02:21.611 SYMLINK libspdk_idxd.so 00:02:21.611 SO libspdk_vmd.so.6.0 00:02:21.611 SYMLINK libspdk_reduce.so 00:02:21.611 SYMLINK libspdk_vmd.so 00:02:21.611 CC lib/jsonrpc/jsonrpc_server.o 00:02:21.611 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:21.611 CC lib/jsonrpc/jsonrpc_client.o 00:02:21.611 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:22.178 LIB libspdk_jsonrpc.a 00:02:22.178 SO libspdk_jsonrpc.so.6.0 00:02:22.178 SYMLINK libspdk_jsonrpc.so 00:02:22.437 LIB libspdk_env_dpdk.a 00:02:22.437 SO libspdk_env_dpdk.so.14.0 00:02:22.437 CC lib/rpc/rpc.o 00:02:22.696 SYMLINK libspdk_env_dpdk.so 00:02:22.696 LIB libspdk_rpc.a 00:02:22.696 SO libspdk_rpc.so.6.0 00:02:22.955 SYMLINK libspdk_rpc.so 00:02:23.213 CC lib/notify/notify.o 00:02:23.213 CC lib/notify/notify_rpc.o 00:02:23.214 CC lib/trace/trace.o 00:02:23.214 CC lib/keyring/keyring.o 00:02:23.214 CC lib/keyring/keyring_rpc.o 00:02:23.214 CC lib/trace/trace_flags.o 00:02:23.214 CC lib/trace/trace_rpc.o 00:02:23.472 LIB libspdk_notify.a 00:02:23.472 SO libspdk_notify.so.6.0 00:02:23.472 LIB libspdk_keyring.a 00:02:23.472 LIB libspdk_trace.a 00:02:23.472 SYMLINK libspdk_notify.so 00:02:23.472 SO libspdk_keyring.so.1.0 00:02:23.472 SO libspdk_trace.so.10.0 00:02:23.731 SYMLINK libspdk_keyring.so 00:02:23.731 SYMLINK libspdk_trace.so 00:02:23.990 CC lib/sock/sock.o 00:02:23.990 CC lib/thread/iobuf.o 00:02:23.990 CC lib/sock/sock_rpc.o 00:02:23.990 CC lib/thread/thread.o 00:02:24.557 LIB libspdk_sock.a 00:02:24.557 SO libspdk_sock.so.9.0 00:02:24.557 SYMLINK libspdk_sock.so 00:02:24.816 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:24.816 CC lib/nvme/nvme_ctrlr.o 00:02:24.816 CC lib/nvme/nvme_fabric.o 00:02:24.816 CC lib/nvme/nvme_ns_cmd.o 00:02:25.075 CC lib/nvme/nvme_ns.o 00:02:25.075 CC lib/nvme/nvme_pcie_common.o 00:02:25.075 CC lib/nvme/nvme_pcie.o 00:02:25.075 CC lib/nvme/nvme_qpair.o 00:02:25.075 CC lib/nvme/nvme.o 00:02:25.075 CC lib/nvme/nvme_quirks.o 00:02:25.075 CC lib/nvme/nvme_transport.o 00:02:25.075 CC lib/nvme/nvme_discovery.o 00:02:25.075 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:25.075 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:25.075 CC lib/nvme/nvme_tcp.o 00:02:25.075 CC lib/nvme/nvme_opal.o 00:02:25.075 CC lib/nvme/nvme_io_msg.o 00:02:25.075 CC lib/nvme/nvme_poll_group.o 00:02:25.075 CC lib/nvme/nvme_stubs.o 00:02:25.075 CC lib/nvme/nvme_zns.o 00:02:25.075 CC lib/nvme/nvme_auth.o 00:02:25.075 CC lib/nvme/nvme_cuse.o 00:02:25.075 CC lib/nvme/nvme_rdma.o 00:02:25.075 LIB libspdk_thread.a 00:02:25.075 SO libspdk_thread.so.10.0 00:02:25.334 SYMLINK libspdk_thread.so 00:02:25.592 CC lib/virtio/virtio.o 00:02:25.592 CC lib/virtio/virtio_vhost_user.o 00:02:25.592 CC lib/virtio/virtio_vfio_user.o 00:02:25.592 CC lib/virtio/virtio_pci.o 00:02:25.592 CC lib/init/json_config.o 00:02:25.592 CC lib/init/subsystem.o 00:02:25.592 CC lib/init/rpc.o 00:02:25.592 CC lib/init/subsystem_rpc.o 00:02:25.592 CC lib/accel/accel.o 00:02:25.592 CC lib/accel/accel_rpc.o 00:02:25.592 CC lib/accel/accel_sw.o 00:02:25.592 CC lib/blob/blobstore.o 00:02:25.592 CC lib/blob/zeroes.o 00:02:25.592 CC lib/blob/request.o 00:02:25.592 CC lib/blob/blob_bs_dev.o 00:02:25.851 LIB libspdk_init.a 00:02:25.851 SO libspdk_init.so.5.0 00:02:25.851 LIB libspdk_virtio.a 00:02:26.110 SYMLINK libspdk_init.so 00:02:26.110 SO libspdk_virtio.so.7.0 00:02:26.110 SYMLINK 
libspdk_virtio.so 00:02:26.369 CC lib/event/app.o 00:02:26.369 CC lib/event/reactor.o 00:02:26.369 CC lib/event/log_rpc.o 00:02:26.369 CC lib/event/scheduler_static.o 00:02:26.369 CC lib/event/app_rpc.o 00:02:26.369 LIB libspdk_accel.a 00:02:26.369 SO libspdk_accel.so.15.0 00:02:26.628 SYMLINK libspdk_accel.so 00:02:26.887 LIB libspdk_event.a 00:02:26.887 SO libspdk_event.so.13.0 00:02:26.887 CC lib/bdev/bdev.o 00:02:26.887 CC lib/bdev/bdev_rpc.o 00:02:26.887 CC lib/bdev/bdev_zone.o 00:02:26.887 CC lib/bdev/part.o 00:02:26.887 CC lib/bdev/scsi_nvme.o 00:02:26.887 SYMLINK libspdk_event.so 00:02:27.145 LIB libspdk_nvme.a 00:02:27.404 SO libspdk_nvme.so.13.0 00:02:27.663 SYMLINK libspdk_nvme.so 00:02:28.602 LIB libspdk_blob.a 00:02:28.602 SO libspdk_blob.so.11.0 00:02:28.602 SYMLINK libspdk_blob.so 00:02:29.170 CC lib/lvol/lvol.o 00:02:29.170 CC lib/blobfs/blobfs.o 00:02:29.170 CC lib/blobfs/tree.o 00:02:29.429 LIB libspdk_bdev.a 00:02:29.688 SO libspdk_bdev.so.15.0 00:02:29.688 SYMLINK libspdk_bdev.so 00:02:29.946 LIB libspdk_blobfs.a 00:02:29.946 SO libspdk_blobfs.so.10.0 00:02:29.946 LIB libspdk_lvol.a 00:02:29.946 SO libspdk_lvol.so.10.0 00:02:29.946 SYMLINK libspdk_blobfs.so 00:02:30.210 CC lib/ublk/ublk.o 00:02:30.210 CC lib/ublk/ublk_rpc.o 00:02:30.210 CC lib/nbd/nbd.o 00:02:30.210 CC lib/nbd/nbd_rpc.o 00:02:30.210 CC lib/nvmf/ctrlr.o 00:02:30.210 CC lib/scsi/dev.o 00:02:30.210 CC lib/nvmf/ctrlr_discovery.o 00:02:30.210 CC lib/scsi/lun.o 00:02:30.210 CC lib/ftl/ftl_core.o 00:02:30.210 CC lib/nvmf/ctrlr_bdev.o 00:02:30.210 CC lib/scsi/scsi_bdev.o 00:02:30.210 CC lib/nvmf/nvmf.o 00:02:30.210 CC lib/ftl/ftl_init.o 00:02:30.210 CC lib/ftl/ftl_layout.o 00:02:30.210 CC lib/scsi/port.o 00:02:30.210 CC lib/scsi/scsi.o 00:02:30.210 CC lib/nvmf/subsystem.o 00:02:30.210 CC lib/ftl/ftl_debug.o 00:02:30.210 CC lib/nvmf/nvmf_rpc.o 00:02:30.210 CC lib/scsi/scsi_pr.o 00:02:30.210 CC lib/scsi/scsi_rpc.o 00:02:30.210 CC lib/nvmf/transport.o 00:02:30.210 CC lib/ftl/ftl_io.o 00:02:30.210 SYMLINK libspdk_lvol.so 00:02:30.210 CC lib/nvmf/stubs.o 00:02:30.210 CC lib/scsi/task.o 00:02:30.210 CC lib/nvmf/tcp.o 00:02:30.210 CC lib/ftl/ftl_sb.o 00:02:30.210 CC lib/nvmf/rdma.o 00:02:30.210 CC lib/ftl/ftl_l2p.o 00:02:30.210 CC lib/ftl/ftl_l2p_flat.o 00:02:30.210 CC lib/nvmf/auth.o 00:02:30.210 CC lib/ftl/ftl_nv_cache.o 00:02:30.210 CC lib/ftl/ftl_band_ops.o 00:02:30.210 CC lib/ftl/ftl_band.o 00:02:30.210 CC lib/ftl/ftl_writer.o 00:02:30.210 CC lib/ftl/ftl_reloc.o 00:02:30.210 CC lib/ftl/ftl_rq.o 00:02:30.210 CC lib/ftl/ftl_l2p_cache.o 00:02:30.210 CC lib/ftl/ftl_p2l.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:30.210 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:30.210 CC lib/ftl/utils/ftl_conf.o 00:02:30.210 CC lib/ftl/utils/ftl_md.o 00:02:30.210 CC lib/ftl/utils/ftl_bitmap.o 00:02:30.210 CC lib/ftl/utils/ftl_mempool.o 00:02:30.210 CC lib/ftl/utils/ftl_property.o 00:02:30.210 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:30.210 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:30.210 CC lib/ftl/upgrade/ftl_sb_upgrade.o 
00:02:30.210 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:30.210 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:30.210 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:30.210 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:30.210 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:30.210 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:30.210 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:30.210 CC lib/ftl/base/ftl_base_dev.o 00:02:30.210 CC lib/ftl/ftl_trace.o 00:02:30.210 CC lib/ftl/base/ftl_base_bdev.o 00:02:30.777 LIB libspdk_nbd.a 00:02:30.777 SO libspdk_nbd.so.7.0 00:02:30.777 SYMLINK libspdk_nbd.so 00:02:31.036 LIB libspdk_scsi.a 00:02:31.036 LIB libspdk_ublk.a 00:02:31.036 SO libspdk_scsi.so.9.0 00:02:31.036 SO libspdk_ublk.so.3.0 00:02:31.036 SYMLINK libspdk_ublk.so 00:02:31.036 SYMLINK libspdk_scsi.so 00:02:31.296 LIB libspdk_ftl.a 00:02:31.555 CC lib/vhost/vhost.o 00:02:31.555 CC lib/vhost/vhost_rpc.o 00:02:31.555 CC lib/iscsi/conn.o 00:02:31.555 CC lib/vhost/vhost_blk.o 00:02:31.555 CC lib/vhost/vhost_scsi.o 00:02:31.555 CC lib/iscsi/init_grp.o 00:02:31.555 CC lib/iscsi/iscsi.o 00:02:31.555 CC lib/vhost/rte_vhost_user.o 00:02:31.555 CC lib/iscsi/md5.o 00:02:31.555 CC lib/iscsi/param.o 00:02:31.555 CC lib/iscsi/portal_grp.o 00:02:31.555 CC lib/iscsi/tgt_node.o 00:02:31.555 CC lib/iscsi/iscsi_subsystem.o 00:02:31.555 CC lib/iscsi/iscsi_rpc.o 00:02:31.555 CC lib/iscsi/task.o 00:02:31.555 SO libspdk_ftl.so.9.0 00:02:32.124 SYMLINK libspdk_ftl.so 00:02:32.413 LIB libspdk_nvmf.a 00:02:32.413 SO libspdk_nvmf.so.18.0 00:02:32.672 LIB libspdk_vhost.a 00:02:32.672 SYMLINK libspdk_nvmf.so 00:02:32.672 SO libspdk_vhost.so.8.0 00:02:32.672 SYMLINK libspdk_vhost.so 00:02:32.931 LIB libspdk_iscsi.a 00:02:32.931 SO libspdk_iscsi.so.8.0 00:02:33.189 SYMLINK libspdk_iscsi.so 00:02:33.757 CC module/env_dpdk/env_dpdk_rpc.o 00:02:33.757 CC module/sock/posix/posix.o 00:02:33.757 CC module/accel/ioat/accel_ioat.o 00:02:33.757 CC module/accel/ioat/accel_ioat_rpc.o 00:02:33.757 LIB libspdk_env_dpdk_rpc.a 00:02:33.757 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:33.757 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:33.757 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:33.757 CC module/accel/error/accel_error.o 00:02:33.757 CC module/accel/error/accel_error_rpc.o 00:02:33.757 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:33.757 CC module/accel/iaa/accel_iaa_rpc.o 00:02:33.757 CC module/accel/dsa/accel_dsa_rpc.o 00:02:33.757 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:33.757 CC module/accel/dsa/accel_dsa.o 00:02:33.757 CC module/accel/iaa/accel_iaa.o 00:02:33.757 CC module/keyring/file/keyring.o 00:02:33.757 CC module/keyring/file/keyring_rpc.o 00:02:33.757 CC module/blob/bdev/blob_bdev.o 00:02:33.757 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:33.757 CC module/scheduler/gscheduler/gscheduler.o 00:02:34.016 SO libspdk_env_dpdk_rpc.so.6.0 00:02:34.016 SYMLINK libspdk_env_dpdk_rpc.so 00:02:34.016 LIB libspdk_keyring_file.a 00:02:34.016 LIB libspdk_scheduler_gscheduler.a 00:02:34.016 LIB libspdk_scheduler_dpdk_governor.a 00:02:34.016 LIB libspdk_accel_ioat.a 00:02:34.016 LIB libspdk_accel_error.a 00:02:34.016 SO libspdk_keyring_file.so.1.0 00:02:34.016 SO libspdk_scheduler_gscheduler.so.4.0 00:02:34.016 LIB libspdk_scheduler_dynamic.a 00:02:34.016 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:34.016 LIB libspdk_accel_iaa.a 00:02:34.016 SO libspdk_accel_ioat.so.6.0 00:02:34.016 SO libspdk_accel_error.so.2.0 00:02:34.274 LIB libspdk_accel_dsa.a 00:02:34.274 SO 
libspdk_scheduler_dynamic.so.4.0 00:02:34.274 SO libspdk_accel_iaa.so.3.0 00:02:34.274 SYMLINK libspdk_scheduler_gscheduler.so 00:02:34.274 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:34.274 SYMLINK libspdk_keyring_file.so 00:02:34.274 LIB libspdk_blob_bdev.a 00:02:34.274 SO libspdk_accel_dsa.so.5.0 00:02:34.274 SYMLINK libspdk_accel_ioat.so 00:02:34.274 SYMLINK libspdk_accel_error.so 00:02:34.274 SO libspdk_blob_bdev.so.11.0 00:02:34.274 SYMLINK libspdk_scheduler_dynamic.so 00:02:34.274 SYMLINK libspdk_accel_iaa.so 00:02:34.274 SYMLINK libspdk_accel_dsa.so 00:02:34.274 SYMLINK libspdk_blob_bdev.so 00:02:34.534 LIB libspdk_sock_posix.a 00:02:34.793 SO libspdk_sock_posix.so.6.0 00:02:34.793 SYMLINK libspdk_sock_posix.so 00:02:34.793 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:34.793 CC module/blobfs/bdev/blobfs_bdev.o 00:02:34.793 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:34.793 CC module/bdev/gpt/vbdev_gpt.o 00:02:34.793 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:34.793 CC module/bdev/gpt/gpt.o 00:02:34.793 CC module/bdev/split/vbdev_split_rpc.o 00:02:34.793 CC module/bdev/split/vbdev_split.o 00:02:34.793 CC module/bdev/crypto/vbdev_crypto.o 00:02:34.793 CC module/bdev/delay/vbdev_delay.o 00:02:34.793 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:34.793 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:34.793 CC module/bdev/nvme/bdev_nvme.o 00:02:34.793 CC module/bdev/null/bdev_null_rpc.o 00:02:34.793 CC module/bdev/aio/bdev_aio.o 00:02:34.793 CC module/bdev/nvme/nvme_rpc.o 00:02:34.793 CC module/bdev/null/bdev_null.o 00:02:34.793 CC module/bdev/malloc/bdev_malloc.o 00:02:34.793 CC module/bdev/raid/bdev_raid.o 00:02:34.793 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:34.793 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:34.793 CC module/bdev/nvme/vbdev_opal.o 00:02:34.793 CC module/bdev/aio/bdev_aio_rpc.o 00:02:34.793 CC module/bdev/raid/bdev_raid_rpc.o 00:02:34.793 CC module/bdev/raid/bdev_raid_sb.o 00:02:34.793 CC module/bdev/nvme/bdev_mdns_client.o 00:02:34.793 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:34.793 CC module/bdev/error/vbdev_error.o 00:02:34.793 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:34.793 CC module/bdev/passthru/vbdev_passthru.o 00:02:34.793 CC module/bdev/raid/raid0.o 00:02:34.793 CC module/bdev/error/vbdev_error_rpc.o 00:02:34.793 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:34.793 CC module/bdev/iscsi/bdev_iscsi.o 00:02:34.793 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:34.793 CC module/bdev/raid/raid1.o 00:02:34.793 CC module/bdev/raid/concat.o 00:02:34.793 CC module/bdev/lvol/vbdev_lvol.o 00:02:34.793 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:34.793 CC module/bdev/compress/vbdev_compress.o 00:02:34.793 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:34.793 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:34.793 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:34.793 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:34.793 CC module/bdev/ftl/bdev_ftl.o 00:02:34.793 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:35.052 LIB libspdk_accel_dpdk_cryptodev.a 00:02:35.052 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:35.052 LIB libspdk_accel_dpdk_compressdev.a 00:02:35.052 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:35.052 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:35.052 LIB libspdk_blobfs_bdev.a 00:02:35.052 LIB libspdk_bdev_split.a 00:02:35.052 SO libspdk_bdev_split.so.6.0 00:02:35.052 SO libspdk_blobfs_bdev.so.6.0 00:02:35.052 LIB libspdk_bdev_error.a 00:02:35.052 LIB libspdk_bdev_passthru.a 00:02:35.310 SYMLINK libspdk_accel_dpdk_compressdev.so 
00:02:35.310 LIB libspdk_bdev_malloc.a 00:02:35.310 LIB libspdk_bdev_aio.a 00:02:35.310 SO libspdk_bdev_error.so.6.0 00:02:35.310 LIB libspdk_bdev_crypto.a 00:02:35.310 SO libspdk_bdev_passthru.so.6.0 00:02:35.310 SO libspdk_bdev_malloc.so.6.0 00:02:35.310 LIB libspdk_bdev_null.a 00:02:35.310 SYMLINK libspdk_blobfs_bdev.so 00:02:35.310 SYMLINK libspdk_bdev_split.so 00:02:35.310 LIB libspdk_bdev_gpt.a 00:02:35.310 LIB libspdk_bdev_delay.a 00:02:35.310 LIB libspdk_bdev_ftl.a 00:02:35.311 SO libspdk_bdev_aio.so.6.0 00:02:35.311 SO libspdk_bdev_crypto.so.6.0 00:02:35.311 SO libspdk_bdev_null.so.6.0 00:02:35.311 SYMLINK libspdk_bdev_error.so 00:02:35.311 SO libspdk_bdev_gpt.so.6.0 00:02:35.311 LIB libspdk_bdev_iscsi.a 00:02:35.311 SYMLINK libspdk_bdev_passthru.so 00:02:35.311 SO libspdk_bdev_ftl.so.6.0 00:02:35.311 SO libspdk_bdev_delay.so.6.0 00:02:35.311 SYMLINK libspdk_bdev_malloc.so 00:02:35.311 LIB libspdk_bdev_zone_block.a 00:02:35.311 SYMLINK libspdk_bdev_crypto.so 00:02:35.311 SYMLINK libspdk_bdev_aio.so 00:02:35.311 SO libspdk_bdev_iscsi.so.6.0 00:02:35.311 LIB libspdk_bdev_compress.a 00:02:35.311 SYMLINK libspdk_bdev_null.so 00:02:35.311 SO libspdk_bdev_zone_block.so.6.0 00:02:35.311 SYMLINK libspdk_bdev_gpt.so 00:02:35.311 SYMLINK libspdk_bdev_ftl.so 00:02:35.311 SYMLINK libspdk_bdev_delay.so 00:02:35.311 SO libspdk_bdev_compress.so.6.0 00:02:35.311 SYMLINK libspdk_bdev_iscsi.so 00:02:35.311 SYMLINK libspdk_bdev_zone_block.so 00:02:35.570 LIB libspdk_bdev_lvol.a 00:02:35.570 SYMLINK libspdk_bdev_compress.so 00:02:35.570 SO libspdk_bdev_lvol.so.6.0 00:02:35.570 LIB libspdk_bdev_virtio.a 00:02:35.570 SO libspdk_bdev_virtio.so.6.0 00:02:35.570 SYMLINK libspdk_bdev_lvol.so 00:02:35.570 SYMLINK libspdk_bdev_virtio.so 00:02:35.828 LIB libspdk_bdev_raid.a 00:02:36.087 SO libspdk_bdev_raid.so.6.0 00:02:36.087 SYMLINK libspdk_bdev_raid.so 00:02:37.025 LIB libspdk_bdev_nvme.a 00:02:37.284 SO libspdk_bdev_nvme.so.7.0 00:02:37.284 SYMLINK libspdk_bdev_nvme.so 00:02:38.222 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:38.222 CC module/event/subsystems/iobuf/iobuf.o 00:02:38.222 CC module/event/subsystems/vmd/vmd.o 00:02:38.222 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:38.222 CC module/event/subsystems/keyring/keyring.o 00:02:38.222 CC module/event/subsystems/sock/sock.o 00:02:38.222 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:38.222 CC module/event/subsystems/scheduler/scheduler.o 00:02:38.222 LIB libspdk_event_iobuf.a 00:02:38.222 LIB libspdk_event_keyring.a 00:02:38.222 LIB libspdk_event_sock.a 00:02:38.222 LIB libspdk_event_vhost_blk.a 00:02:38.222 LIB libspdk_event_vmd.a 00:02:38.222 LIB libspdk_event_scheduler.a 00:02:38.222 SO libspdk_event_iobuf.so.3.0 00:02:38.222 SO libspdk_event_keyring.so.1.0 00:02:38.222 SO libspdk_event_vmd.so.6.0 00:02:38.222 SO libspdk_event_sock.so.5.0 00:02:38.222 SO libspdk_event_vhost_blk.so.3.0 00:02:38.222 SO libspdk_event_scheduler.so.4.0 00:02:38.222 SYMLINK libspdk_event_keyring.so 00:02:38.222 SYMLINK libspdk_event_iobuf.so 00:02:38.222 SYMLINK libspdk_event_sock.so 00:02:38.222 SYMLINK libspdk_event_vhost_blk.so 00:02:38.222 SYMLINK libspdk_event_vmd.so 00:02:38.480 SYMLINK libspdk_event_scheduler.so 00:02:38.738 CC module/event/subsystems/accel/accel.o 00:02:38.996 LIB libspdk_event_accel.a 00:02:38.996 SO libspdk_event_accel.so.6.0 00:02:38.996 SYMLINK libspdk_event_accel.so 00:02:39.255 CC module/event/subsystems/bdev/bdev.o 00:02:39.513 LIB libspdk_event_bdev.a 00:02:39.513 SO libspdk_event_bdev.so.6.0 00:02:39.772 SYMLINK 
libspdk_event_bdev.so 00:02:40.030 CC module/event/subsystems/scsi/scsi.o 00:02:40.030 CC module/event/subsystems/nbd/nbd.o 00:02:40.030 CC module/event/subsystems/ublk/ublk.o 00:02:40.030 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:40.030 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:40.290 LIB libspdk_event_nbd.a 00:02:40.290 LIB libspdk_event_ublk.a 00:02:40.290 LIB libspdk_event_scsi.a 00:02:40.290 SO libspdk_event_nbd.so.6.0 00:02:40.290 SO libspdk_event_ublk.so.3.0 00:02:40.290 SO libspdk_event_scsi.so.6.0 00:02:40.290 LIB libspdk_event_nvmf.a 00:02:40.290 SYMLINK libspdk_event_nbd.so 00:02:40.290 SYMLINK libspdk_event_ublk.so 00:02:40.290 SYMLINK libspdk_event_scsi.so 00:02:40.290 SO libspdk_event_nvmf.so.6.0 00:02:40.549 SYMLINK libspdk_event_nvmf.so 00:02:40.808 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:40.808 CC module/event/subsystems/iscsi/iscsi.o 00:02:40.808 LIB libspdk_event_vhost_scsi.a 00:02:40.808 LIB libspdk_event_iscsi.a 00:02:40.808 SO libspdk_event_vhost_scsi.so.3.0 00:02:41.069 SO libspdk_event_iscsi.so.6.0 00:02:41.069 SYMLINK libspdk_event_vhost_scsi.so 00:02:41.069 SYMLINK libspdk_event_iscsi.so 00:02:41.328 SO libspdk.so.6.0 00:02:41.328 SYMLINK libspdk.so 00:02:41.590 CC app/trace_record/trace_record.o 00:02:41.590 CC app/spdk_lspci/spdk_lspci.o 00:02:41.590 CC app/spdk_nvme_identify/identify.o 00:02:41.590 CXX app/trace/trace.o 00:02:41.590 CC app/spdk_top/spdk_top.o 00:02:41.590 CC app/spdk_nvme_discover/discovery_aer.o 00:02:41.590 CC test/rpc_client/rpc_client_test.o 00:02:41.590 CC app/spdk_nvme_perf/perf.o 00:02:41.590 TEST_HEADER include/spdk/accel.h 00:02:41.590 TEST_HEADER include/spdk/accel_module.h 00:02:41.590 TEST_HEADER include/spdk/assert.h 00:02:41.590 TEST_HEADER include/spdk/barrier.h 00:02:41.590 TEST_HEADER include/spdk/base64.h 00:02:41.590 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:41.590 TEST_HEADER include/spdk/bdev.h 00:02:41.590 TEST_HEADER include/spdk/bdev_module.h 00:02:41.590 TEST_HEADER include/spdk/bdev_zone.h 00:02:41.590 TEST_HEADER include/spdk/bit_array.h 00:02:41.590 TEST_HEADER include/spdk/bit_pool.h 00:02:41.590 TEST_HEADER include/spdk/blob_bdev.h 00:02:41.590 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:41.590 TEST_HEADER include/spdk/blobfs.h 00:02:41.590 TEST_HEADER include/spdk/blob.h 00:02:41.590 CC app/iscsi_tgt/iscsi_tgt.o 00:02:41.590 TEST_HEADER include/spdk/conf.h 00:02:41.590 TEST_HEADER include/spdk/config.h 00:02:41.590 CC app/spdk_dd/spdk_dd.o 00:02:41.853 TEST_HEADER include/spdk/cpuset.h 00:02:41.853 TEST_HEADER include/spdk/crc16.h 00:02:41.853 CC app/nvmf_tgt/nvmf_main.o 00:02:41.853 TEST_HEADER include/spdk/crc32.h 00:02:41.853 TEST_HEADER include/spdk/crc64.h 00:02:41.853 TEST_HEADER include/spdk/dif.h 00:02:41.853 CC app/vhost/vhost.o 00:02:41.853 TEST_HEADER include/spdk/dma.h 00:02:41.853 TEST_HEADER include/spdk/endian.h 00:02:41.853 TEST_HEADER include/spdk/env_dpdk.h 00:02:41.853 TEST_HEADER include/spdk/env.h 00:02:41.853 TEST_HEADER include/spdk/event.h 00:02:41.853 TEST_HEADER include/spdk/fd_group.h 00:02:41.853 TEST_HEADER include/spdk/fd.h 00:02:41.853 TEST_HEADER include/spdk/file.h 00:02:41.853 TEST_HEADER include/spdk/ftl.h 00:02:41.853 CC examples/vmd/led/led.o 00:02:41.853 TEST_HEADER include/spdk/gpt_spec.h 00:02:41.853 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:41.853 CC examples/ioat/verify/verify.o 00:02:41.853 CC examples/nvme/hotplug/hotplug.o 00:02:41.853 CC examples/nvme/hello_world/hello_world.o 00:02:41.853 CC test/env/vtophys/vtophys.o 
00:02:41.853 TEST_HEADER include/spdk/hexlify.h 00:02:41.853 TEST_HEADER include/spdk/histogram_data.h 00:02:41.853 CC examples/nvme/arbitration/arbitration.o 00:02:41.853 CC examples/sock/hello_world/hello_sock.o 00:02:41.853 TEST_HEADER include/spdk/idxd.h 00:02:41.853 CC examples/ioat/perf/perf.o 00:02:41.853 TEST_HEADER include/spdk/idxd_spec.h 00:02:41.853 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:41.853 CC app/fio/nvme/fio_plugin.o 00:02:41.853 TEST_HEADER include/spdk/init.h 00:02:41.853 CC test/app/jsoncat/jsoncat.o 00:02:41.853 CC app/spdk_tgt/spdk_tgt.o 00:02:41.853 TEST_HEADER include/spdk/ioat.h 00:02:41.853 CC examples/vmd/lsvmd/lsvmd.o 00:02:41.853 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:41.853 CC examples/nvme/abort/abort.o 00:02:41.853 CC test/env/pci/pci_ut.o 00:02:41.853 TEST_HEADER include/spdk/ioat_spec.h 00:02:41.853 CC examples/idxd/perf/perf.o 00:02:41.853 CC test/thread/poller_perf/poller_perf.o 00:02:41.853 CC examples/nvme/reconnect/reconnect.o 00:02:41.853 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:41.853 CC test/event/reactor_perf/reactor_perf.o 00:02:41.853 CC test/nvme/e2edp/nvme_dp.o 00:02:41.853 CC examples/util/zipf/zipf.o 00:02:41.853 CC test/nvme/sgl/sgl.o 00:02:41.853 CC test/event/reactor/reactor.o 00:02:41.853 TEST_HEADER include/spdk/iscsi_spec.h 00:02:41.853 CC test/app/histogram_perf/histogram_perf.o 00:02:41.853 CC test/event/event_perf/event_perf.o 00:02:41.853 TEST_HEADER include/spdk/json.h 00:02:41.853 CC examples/accel/perf/accel_perf.o 00:02:41.853 CC test/nvme/connect_stress/connect_stress.o 00:02:41.853 TEST_HEADER include/spdk/jsonrpc.h 00:02:41.853 TEST_HEADER include/spdk/keyring.h 00:02:41.853 CC test/nvme/err_injection/err_injection.o 00:02:41.853 TEST_HEADER include/spdk/keyring_module.h 00:02:41.853 CC test/nvme/boot_partition/boot_partition.o 00:02:41.853 CC test/nvme/reset/reset.o 00:02:41.853 CC test/nvme/aer/aer.o 00:02:41.853 CC test/nvme/compliance/nvme_compliance.o 00:02:41.853 TEST_HEADER include/spdk/likely.h 00:02:41.853 TEST_HEADER include/spdk/log.h 00:02:41.853 CC test/nvme/simple_copy/simple_copy.o 00:02:41.853 CC test/nvme/reserve/reserve.o 00:02:41.853 TEST_HEADER include/spdk/lvol.h 00:02:41.853 CC test/nvme/overhead/overhead.o 00:02:41.853 CC test/nvme/startup/startup.o 00:02:41.853 TEST_HEADER include/spdk/memory.h 00:02:41.853 TEST_HEADER include/spdk/mmio.h 00:02:41.853 CC test/env/memory/memory_ut.o 00:02:41.853 CC test/event/app_repeat/app_repeat.o 00:02:41.853 CC test/app/stub/stub.o 00:02:41.853 TEST_HEADER include/spdk/nbd.h 00:02:41.853 CC examples/blob/cli/blobcli.o 00:02:41.853 TEST_HEADER include/spdk/notify.h 00:02:41.853 TEST_HEADER include/spdk/nvme.h 00:02:41.853 TEST_HEADER include/spdk/nvme_intel.h 00:02:41.853 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:41.853 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:41.853 CC examples/nvmf/nvmf/nvmf.o 00:02:41.853 TEST_HEADER include/spdk/nvme_spec.h 00:02:41.853 CC examples/blob/hello_world/hello_blob.o 00:02:41.853 CC examples/bdev/hello_world/hello_bdev.o 00:02:41.853 CC test/event/scheduler/scheduler.o 00:02:41.853 CC examples/bdev/bdevperf/bdevperf.o 00:02:41.853 CC examples/thread/thread/thread_ex.o 00:02:41.853 CC test/app/bdev_svc/bdev_svc.o 00:02:41.853 TEST_HEADER include/spdk/nvme_zns.h 00:02:41.853 CC test/accel/dif/dif.o 00:02:41.853 CC test/blobfs/mkfs/mkfs.o 00:02:41.853 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:41.853 CC app/fio/bdev/fio_plugin.o 00:02:41.853 TEST_HEADER include/spdk/nvmf_fc_spec.h 
00:02:41.853 CC test/dma/test_dma/test_dma.o 00:02:41.853 TEST_HEADER include/spdk/nvmf.h 00:02:41.853 CC test/bdev/bdevio/bdevio.o 00:02:41.853 TEST_HEADER include/spdk/nvmf_spec.h 00:02:41.853 TEST_HEADER include/spdk/nvmf_transport.h 00:02:41.853 LINK spdk_lspci 00:02:41.853 TEST_HEADER include/spdk/opal.h 00:02:42.116 TEST_HEADER include/spdk/opal_spec.h 00:02:42.116 TEST_HEADER include/spdk/pci_ids.h 00:02:42.116 TEST_HEADER include/spdk/pipe.h 00:02:42.116 TEST_HEADER include/spdk/queue.h 00:02:42.116 TEST_HEADER include/spdk/reduce.h 00:02:42.116 TEST_HEADER include/spdk/rpc.h 00:02:42.116 TEST_HEADER include/spdk/scheduler.h 00:02:42.116 TEST_HEADER include/spdk/scsi.h 00:02:42.116 TEST_HEADER include/spdk/scsi_spec.h 00:02:42.116 TEST_HEADER include/spdk/sock.h 00:02:42.116 TEST_HEADER include/spdk/stdinc.h 00:02:42.116 TEST_HEADER include/spdk/string.h 00:02:42.116 TEST_HEADER include/spdk/thread.h 00:02:42.116 CC test/env/mem_callbacks/mem_callbacks.o 00:02:42.116 LINK spdk_nvme_discover 00:02:42.116 TEST_HEADER include/spdk/trace.h 00:02:42.116 TEST_HEADER include/spdk/trace_parser.h 00:02:42.116 TEST_HEADER include/spdk/tree.h 00:02:42.116 TEST_HEADER include/spdk/ublk.h 00:02:42.116 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:42.116 TEST_HEADER include/spdk/util.h 00:02:42.116 TEST_HEADER include/spdk/uuid.h 00:02:42.116 TEST_HEADER include/spdk/version.h 00:02:42.116 CC test/lvol/esnap/esnap.o 00:02:42.116 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:42.116 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:42.116 LINK rpc_client_test 00:02:42.116 LINK interrupt_tgt 00:02:42.116 TEST_HEADER include/spdk/vhost.h 00:02:42.116 TEST_HEADER include/spdk/vmd.h 00:02:42.116 TEST_HEADER include/spdk/xor.h 00:02:42.116 TEST_HEADER include/spdk/zipf.h 00:02:42.116 CXX test/cpp_headers/accel.o 00:02:42.116 LINK reactor 00:02:42.116 LINK led 00:02:42.117 LINK reactor_perf 00:02:42.117 LINK zipf 00:02:42.117 LINK nvmf_tgt 00:02:42.117 LINK lsvmd 00:02:42.117 LINK histogram_perf 00:02:42.117 LINK event_perf 00:02:42.117 LINK iscsi_tgt 00:02:42.117 LINK vtophys 00:02:42.117 LINK env_dpdk_post_init 00:02:42.117 LINK spdk_trace_record 00:02:42.117 LINK vhost 00:02:42.117 LINK jsoncat 00:02:42.117 LINK app_repeat 00:02:42.117 LINK pmr_persistence 00:02:42.380 LINK hello_world 00:02:42.380 LINK poller_perf 00:02:42.380 LINK boot_partition 00:02:42.380 LINK cmb_copy 00:02:42.380 LINK verify 00:02:42.380 LINK startup 00:02:42.380 LINK ioat_perf 00:02:42.380 LINK stub 00:02:42.380 LINK connect_stress 00:02:42.380 LINK hello_sock 00:02:42.380 LINK spdk_tgt 00:02:42.380 LINK err_injection 00:02:42.380 LINK simple_copy 00:02:42.380 LINK reserve 00:02:42.380 LINK sgl 00:02:42.380 LINK hotplug 00:02:42.380 LINK bdev_svc 00:02:42.380 LINK mkfs 00:02:42.380 LINK reset 00:02:42.380 LINK hello_bdev 00:02:42.380 LINK hello_blob 00:02:42.380 LINK reconnect 00:02:42.380 LINK aer 00:02:42.380 LINK thread 00:02:42.380 LINK arbitration 00:02:42.380 LINK nvme_dp 00:02:42.380 LINK scheduler 00:02:42.646 LINK idxd_perf 00:02:42.646 LINK overhead 00:02:42.646 CXX test/cpp_headers/accel_module.o 00:02:42.646 LINK nvme_compliance 00:02:42.646 LINK spdk_dd 00:02:42.646 CXX test/cpp_headers/assert.o 00:02:42.646 LINK spdk_trace 00:02:42.646 CXX test/cpp_headers/barrier.o 00:02:42.646 CXX test/cpp_headers/base64.o 00:02:42.646 CXX test/cpp_headers/bdev.o 00:02:42.646 CXX test/cpp_headers/bdev_module.o 00:02:42.646 LINK nvmf 00:02:42.646 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:42.646 CXX 
test/cpp_headers/bdev_zone.o 00:02:42.646 CXX test/cpp_headers/bit_array.o 00:02:42.646 CXX test/cpp_headers/bit_pool.o 00:02:42.646 LINK bdevio 00:02:42.646 CXX test/cpp_headers/blob_bdev.o 00:02:42.646 LINK abort 00:02:42.646 CXX test/cpp_headers/blobfs_bdev.o 00:02:42.646 CXX test/cpp_headers/blobfs.o 00:02:42.646 CC test/nvme/fused_ordering/fused_ordering.o 00:02:42.646 CXX test/cpp_headers/blob.o 00:02:42.646 LINK pci_ut 00:02:42.646 CXX test/cpp_headers/conf.o 00:02:42.646 CXX test/cpp_headers/config.o 00:02:42.646 CXX test/cpp_headers/cpuset.o 00:02:42.646 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:42.646 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:42.646 CXX test/cpp_headers/crc32.o 00:02:42.646 CXX test/cpp_headers/crc16.o 00:02:42.646 LINK dif 00:02:42.646 CXX test/cpp_headers/crc64.o 00:02:42.646 LINK test_dma 00:02:42.646 CXX test/cpp_headers/dif.o 00:02:42.646 CXX test/cpp_headers/dma.o 00:02:42.646 CXX test/cpp_headers/endian.o 00:02:42.646 CXX test/cpp_headers/env_dpdk.o 00:02:42.646 CXX test/cpp_headers/env.o 00:02:42.646 CXX test/cpp_headers/event.o 00:02:42.646 CXX test/cpp_headers/fd_group.o 00:02:42.646 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:42.646 CXX test/cpp_headers/fd.o 00:02:42.646 CXX test/cpp_headers/file.o 00:02:42.646 CXX test/cpp_headers/ftl.o 00:02:42.909 CC test/nvme/cuse/cuse.o 00:02:42.909 CC test/nvme/fdp/fdp.o 00:02:42.909 CXX test/cpp_headers/gpt_spec.o 00:02:42.909 LINK nvme_manage 00:02:42.909 CXX test/cpp_headers/hexlify.o 00:02:42.909 CXX test/cpp_headers/histogram_data.o 00:02:42.909 LINK accel_perf 00:02:42.909 CXX test/cpp_headers/idxd.o 00:02:42.909 CXX test/cpp_headers/idxd_spec.o 00:02:42.909 CXX test/cpp_headers/init.o 00:02:42.909 CXX test/cpp_headers/ioat.o 00:02:42.909 CXX test/cpp_headers/ioat_spec.o 00:02:42.909 LINK blobcli 00:02:42.909 CXX test/cpp_headers/iscsi_spec.o 00:02:42.909 CXX test/cpp_headers/jsonrpc.o 00:02:42.909 CXX test/cpp_headers/json.o 00:02:42.909 LINK spdk_nvme 00:02:42.909 CXX test/cpp_headers/keyring.o 00:02:42.909 CXX test/cpp_headers/keyring_module.o 00:02:42.909 CXX test/cpp_headers/likely.o 00:02:42.909 CXX test/cpp_headers/log.o 00:02:42.909 CXX test/cpp_headers/lvol.o 00:02:42.909 LINK nvme_fuzz 00:02:42.909 CXX test/cpp_headers/memory.o 00:02:42.909 CXX test/cpp_headers/mmio.o 00:02:42.909 CXX test/cpp_headers/nbd.o 00:02:42.909 CXX test/cpp_headers/notify.o 00:02:42.909 CXX test/cpp_headers/nvme.o 00:02:42.909 CXX test/cpp_headers/nvme_intel.o 00:02:42.909 CXX test/cpp_headers/nvme_ocssd.o 00:02:42.909 LINK spdk_bdev 00:02:43.173 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:43.173 CXX test/cpp_headers/nvme_spec.o 00:02:43.173 CXX test/cpp_headers/nvme_zns.o 00:02:43.173 CXX test/cpp_headers/nvmf_cmd.o 00:02:43.173 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:43.173 CXX test/cpp_headers/nvmf_spec.o 00:02:43.173 CXX test/cpp_headers/nvmf.o 00:02:43.173 CXX test/cpp_headers/nvmf_transport.o 00:02:43.173 CXX test/cpp_headers/opal.o 00:02:43.173 LINK mem_callbacks 00:02:43.173 CXX test/cpp_headers/opal_spec.o 00:02:43.173 LINK spdk_nvme_perf 00:02:43.173 CXX test/cpp_headers/pci_ids.o 00:02:43.173 CXX test/cpp_headers/pipe.o 00:02:43.173 LINK fused_ordering 00:02:43.173 CXX test/cpp_headers/queue.o 00:02:43.173 CXX test/cpp_headers/reduce.o 00:02:43.173 CXX test/cpp_headers/rpc.o 00:02:43.173 CXX test/cpp_headers/scheduler.o 00:02:43.173 LINK doorbell_aers 00:02:43.173 CXX test/cpp_headers/scsi.o 00:02:43.173 CXX test/cpp_headers/sock.o 00:02:43.173 CXX test/cpp_headers/scsi_spec.o 00:02:43.173 
CXX test/cpp_headers/stdinc.o 00:02:43.173 CXX test/cpp_headers/string.o 00:02:43.173 LINK spdk_nvme_identify 00:02:43.173 CXX test/cpp_headers/thread.o 00:02:43.173 CXX test/cpp_headers/trace.o 00:02:43.173 CXX test/cpp_headers/trace_parser.o 00:02:43.173 CXX test/cpp_headers/tree.o 00:02:43.173 CXX test/cpp_headers/ublk.o 00:02:43.432 CXX test/cpp_headers/util.o 00:02:43.432 CXX test/cpp_headers/uuid.o 00:02:43.432 CXX test/cpp_headers/version.o 00:02:43.432 CXX test/cpp_headers/vfio_user_pci.o 00:02:43.432 LINK spdk_top 00:02:43.432 CXX test/cpp_headers/vfio_user_spec.o 00:02:43.432 CXX test/cpp_headers/vhost.o 00:02:43.432 CXX test/cpp_headers/vmd.o 00:02:43.432 CXX test/cpp_headers/xor.o 00:02:43.432 CXX test/cpp_headers/zipf.o 00:02:43.432 LINK fdp 00:02:43.432 LINK vhost_fuzz 00:02:43.432 LINK bdevperf 00:02:43.691 LINK memory_ut 00:02:44.627 LINK cuse 00:02:44.627 LINK iscsi_fuzz 00:02:47.917 LINK esnap 00:02:47.917 00:02:47.917 real 1m30.778s 00:02:47.917 user 17m20.018s 00:02:47.917 sys 4m22.148s 00:02:47.917 23:44:48 make -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:02:47.917 23:44:48 make -- common/autotest_common.sh@10 -- $ set +x 00:02:47.917 ************************************ 00:02:47.917 END TEST make 00:02:47.917 ************************************ 00:02:47.917 23:44:48 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:47.917 23:44:48 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:47.917 23:44:48 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:47.917 23:44:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:47.917 23:44:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:47.917 23:44:48 -- pm/common@44 -- $ pid=230190 00:02:47.917 23:44:48 -- pm/common@50 -- $ kill -TERM 230190 00:02:47.917 23:44:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:47.917 23:44:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:47.917 23:44:48 -- pm/common@44 -- $ pid=230192 00:02:47.917 23:44:48 -- pm/common@50 -- $ kill -TERM 230192 00:02:47.917 23:44:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:47.917 23:44:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:47.917 23:44:48 -- pm/common@44 -- $ pid=230194 00:02:47.917 23:44:48 -- pm/common@50 -- $ kill -TERM 230194 00:02:47.917 23:44:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:47.917 23:44:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:47.917 23:44:48 -- pm/common@44 -- $ pid=230223 00:02:47.917 23:44:48 -- pm/common@50 -- $ sudo -E kill -TERM 230223 00:02:48.228 23:44:48 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:48.228 23:44:48 -- nvmf/common.sh@7 -- # uname -s 00:02:48.228 23:44:48 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:48.228 23:44:48 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:48.228 23:44:48 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:48.228 23:44:48 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:48.228 23:44:48 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:48.228 23:44:48 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:48.228 23:44:48 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 
00:02:48.228 23:44:48 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:48.228 23:44:48 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:48.228 23:44:48 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:48.228 23:44:48 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:02:48.229 23:44:48 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:02:48.229 23:44:48 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:48.229 23:44:48 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:48.229 23:44:48 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:48.229 23:44:48 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:48.229 23:44:48 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:48.229 23:44:48 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:48.229 23:44:48 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:48.229 23:44:48 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:48.229 23:44:48 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:48.229 23:44:48 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:48.229 23:44:48 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:48.229 23:44:48 -- paths/export.sh@5 -- # export PATH 00:02:48.229 23:44:48 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:48.229 23:44:48 -- nvmf/common.sh@47 -- # : 0 00:02:48.229 23:44:48 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:48.229 23:44:48 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:48.229 23:44:48 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:48.229 23:44:48 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:48.229 23:44:48 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:48.229 23:44:48 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:48.229 23:44:48 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:48.229 23:44:48 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:48.229 23:44:48 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:48.229 23:44:48 -- spdk/autotest.sh@32 -- # uname -s 00:02:48.229 23:44:48 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:48.229 23:44:48 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:48.229 23:44:48 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:48.229 23:44:48 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:48.229 23:44:48 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:48.229 23:44:48 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:48.229 23:44:48 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:48.229 23:44:48 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:48.229 23:44:48 -- spdk/autotest.sh@48 -- # udevadm_pid=295844 00:02:48.229 23:44:48 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:48.229 23:44:48 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:48.229 23:44:48 -- pm/common@17 -- # local monitor 00:02:48.229 23:44:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:48.229 23:44:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:48.229 23:44:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:48.229 23:44:48 -- pm/common@21 -- # date +%s 00:02:48.229 23:44:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:48.229 23:44:48 -- pm/common@21 -- # date +%s 00:02:48.229 23:44:48 -- pm/common@25 -- # sleep 1 00:02:48.229 23:44:48 -- pm/common@21 -- # date +%s 00:02:48.229 23:44:48 -- pm/common@21 -- # date +%s 00:02:48.229 23:44:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715723088 00:02:48.229 23:44:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715723088 00:02:48.229 23:44:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715723088 00:02:48.229 23:44:48 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715723088 00:02:48.229 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715723088_collect-vmstat.pm.log 00:02:48.229 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715723088_collect-cpu-load.pm.log 00:02:48.229 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715723088_collect-cpu-temp.pm.log 00:02:48.229 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715723088_collect-bmc-pm.bmc.pm.log 00:02:49.167 23:44:49 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:49.167 23:44:49 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:49.167 23:44:49 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:49.167 23:44:49 -- common/autotest_common.sh@10 -- # set +x 00:02:49.167 23:44:49 -- spdk/autotest.sh@59 -- # create_test_list 00:02:49.167 23:44:49 -- common/autotest_common.sh@744 -- # xtrace_disable 00:02:49.167 23:44:49 -- common/autotest_common.sh@10 -- # set +x 00:02:49.167 23:44:49 -- spdk/autotest.sh@61 -- # dirname 
/var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:49.167 23:44:49 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:49.167 23:44:49 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:49.167 23:44:49 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:49.167 23:44:49 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:49.167 23:44:49 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:49.167 23:44:49 -- common/autotest_common.sh@1451 -- # uname 00:02:49.167 23:44:49 -- common/autotest_common.sh@1451 -- # '[' Linux = FreeBSD ']' 00:02:49.167 23:44:49 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:49.167 23:44:49 -- common/autotest_common.sh@1471 -- # uname 00:02:49.167 23:44:49 -- common/autotest_common.sh@1471 -- # [[ Linux = FreeBSD ]] 00:02:49.167 23:44:49 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:49.167 23:44:49 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:49.167 23:44:49 -- spdk/autotest.sh@72 -- # hash lcov 00:02:49.167 23:44:49 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:49.167 23:44:49 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:49.167 --rc lcov_branch_coverage=1 00:02:49.167 --rc lcov_function_coverage=1 00:02:49.167 --rc genhtml_branch_coverage=1 00:02:49.167 --rc genhtml_function_coverage=1 00:02:49.167 --rc genhtml_legend=1 00:02:49.167 --rc geninfo_all_blocks=1 00:02:49.167 ' 00:02:49.167 23:44:49 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:49.167 --rc lcov_branch_coverage=1 00:02:49.167 --rc lcov_function_coverage=1 00:02:49.167 --rc genhtml_branch_coverage=1 00:02:49.167 --rc genhtml_function_coverage=1 00:02:49.167 --rc genhtml_legend=1 00:02:49.167 --rc geninfo_all_blocks=1 00:02:49.167 ' 00:02:49.167 23:44:49 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:49.167 --rc lcov_branch_coverage=1 00:02:49.167 --rc lcov_function_coverage=1 00:02:49.167 --rc genhtml_branch_coverage=1 00:02:49.167 --rc genhtml_function_coverage=1 00:02:49.167 --rc genhtml_legend=1 00:02:49.167 --rc geninfo_all_blocks=1 00:02:49.167 --no-external' 00:02:49.167 23:44:49 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:49.167 --rc lcov_branch_coverage=1 00:02:49.167 --rc lcov_function_coverage=1 00:02:49.167 --rc genhtml_branch_coverage=1 00:02:49.167 --rc genhtml_function_coverage=1 00:02:49.167 --rc genhtml_legend=1 00:02:49.167 --rc geninfo_all_blocks=1 00:02:49.167 --no-external' 00:02:49.167 23:44:49 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:49.426 lcov: LCOV version 1.14 00:02:49.426 23:44:49 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:04.316 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:04.316 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:04.316 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno:no functions found 00:03:04.316 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:04.316 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno:no functions found 00:03:04.316 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:04.316 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno:no functions found 00:03:04.316 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:22.433 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:22.433 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:22.434 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:22.434 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:22.435 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:22.435 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:23.378 23:45:23 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:23.378 23:45:23 -- common/autotest_common.sh@720 -- # xtrace_disable 00:03:23.378 23:45:23 -- common/autotest_common.sh@10 -- # set +x 00:03:23.378 23:45:23 -- spdk/autotest.sh@91 -- # rm -f 00:03:23.378 23:45:23 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:27.574 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:27.574 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:27.574 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:27.574 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:27.574 0000:80:04.1 (8086 2021): Already using the 
ioatdma driver 00:03:27.574 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:27.574 23:45:27 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:27.574 23:45:27 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:03:27.574 23:45:27 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:03:27.574 23:45:27 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:03:27.574 23:45:27 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:03:27.574 23:45:27 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:03:27.574 23:45:27 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:03:27.574 23:45:27 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:27.574 23:45:27 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:03:27.574 23:45:27 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:27.574 23:45:27 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:27.574 23:45:27 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:27.574 23:45:27 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:27.574 23:45:27 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:27.574 23:45:27 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:27.574 No valid GPT data, bailing 00:03:27.574 23:45:28 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:27.574 23:45:28 -- scripts/common.sh@391 -- # pt= 00:03:27.574 23:45:28 -- scripts/common.sh@392 -- # return 1 00:03:27.574 23:45:28 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:27.574 1+0 records in 00:03:27.574 1+0 records out 00:03:27.574 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00465275 s, 225 MB/s 00:03:27.574 23:45:28 -- spdk/autotest.sh@118 -- # sync 00:03:27.574 23:45:28 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:27.574 23:45:28 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:27.574 23:45:28 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:32.857 23:45:33 -- spdk/autotest.sh@124 -- # uname -s 00:03:32.857 23:45:33 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:32.857 23:45:33 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:32.857 23:45:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:32.857 23:45:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:32.857 23:45:33 -- common/autotest_common.sh@10 -- # set +x 00:03:32.857 ************************************ 00:03:32.857 START TEST setup.sh 00:03:32.857 ************************************ 00:03:32.857 23:45:33 setup.sh -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:32.857 * Looking for test storage... 
00:03:32.857 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:32.857 23:45:33 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:32.857 23:45:33 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:32.857 23:45:33 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:32.857 23:45:33 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:32.857 23:45:33 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:32.857 23:45:33 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:32.857 ************************************ 00:03:32.857 START TEST acl 00:03:32.857 ************************************ 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:32.857 * Looking for test storage... 00:03:32.857 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1666 -- # local nvme bdf 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:32.857 23:45:33 setup.sh.acl -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:32.857 23:45:33 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:32.857 23:45:33 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:32.857 23:45:33 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:37.085 23:45:37 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:37.085 23:45:37 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:37.085 23:45:37 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.085 23:45:37 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:37.085 23:45:37 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:37.085 23:45:37 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 Hugepages 00:03:41.281 node hugesize free / total 
00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 00:03:41.281 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.281 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 
23:45:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:41.282 23:45:41 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:41.282 23:45:41 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:41.282 23:45:41 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:41.282 23:45:41 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:41.282 ************************************ 00:03:41.282 START TEST denied 00:03:41.282 ************************************ 00:03:41.282 23:45:41 setup.sh.acl.denied -- common/autotest_common.sh@1121 -- # denied 00:03:41.282 23:45:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:41.282 23:45:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:41.282 23:45:41 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:41.282 23:45:41 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.282 23:45:41 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:45.476 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:45.476 23:45:45 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:50.749 00:03:50.749 real 0m9.363s 00:03:50.749 user 0m3.090s 00:03:50.749 sys 0m5.570s 00:03:50.749 23:45:50 setup.sh.acl.denied -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:50.749 23:45:50 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:50.749 ************************************ 00:03:50.749 END TEST denied 00:03:50.749 ************************************ 00:03:50.749 23:45:50 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:50.749 23:45:50 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:50.749 23:45:50 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:50.750 23:45:50 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:50.750 ************************************ 00:03:50.750 START TEST allowed 00:03:50.750 ************************************ 00:03:50.750 23:45:51 setup.sh.acl.allowed -- common/autotest_common.sh@1121 -- # allowed 00:03:50.750 23:45:51 setup.sh.acl.allowed -- 
setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:50.750 23:45:51 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:50.750 23:45:51 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:50.750 23:45:51 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.750 23:45:51 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:57.321 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:57.321 23:45:57 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:57.321 23:45:57 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:57.321 23:45:57 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:57.321 23:45:57 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:57.321 23:45:57 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:01.517 00:04:01.517 real 0m10.469s 00:04:01.517 user 0m2.814s 00:04:01.517 sys 0m5.375s 00:04:01.517 23:46:01 setup.sh.acl.allowed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:01.517 23:46:01 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:01.517 ************************************ 00:04:01.517 END TEST allowed 00:04:01.517 ************************************ 00:04:01.517 00:04:01.517 real 0m28.266s 00:04:01.517 user 0m8.959s 00:04:01.517 sys 0m16.626s 00:04:01.517 23:46:01 setup.sh.acl -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:01.517 23:46:01 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:01.517 ************************************ 00:04:01.517 END TEST acl 00:04:01.517 ************************************ 00:04:01.517 23:46:01 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:01.517 23:46:01 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:01.517 23:46:01 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:01.517 23:46:01 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:01.517 ************************************ 00:04:01.517 START TEST hugepages 00:04:01.517 ************************************ 00:04:01.517 23:46:01 setup.sh.hugepages -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:01.517 * Looking for test storage... 
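[Note on the two ACL tests above] The denied/allowed tests drive scripts/setup.sh through its PCI filter variables: with PCI_BLOCKED=' 0000:5e:00.0' the config pass logs "Skipping denied controller at 0000:5e:00.0", and with PCI_ALLOWED=0000:5e:00.0 only that controller is rebound (nvme -> vfio-pci). A minimal stand-alone sketch of that usage, assuming the same workspace path and BDF as this run (it reproduces the setup.sh calls seen in the acl.sh trace, not the test harness itself):

    # Sketch only -- same allow/deny variables as the acl.sh trace above.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

    # Deny: the blocked controller is skipped and keeps its current driver.
    PCI_BLOCKED=' 0000:5e:00.0' "$SPDK/scripts/setup.sh" config

    # Allow only this controller: it is rebound to a userspace driver (nvme -> vfio-pci here).
    PCI_ALLOWED=0000:5e:00.0 "$SPDK/scripts/setup.sh" config

    # Hand the device back to its kernel driver afterwards, as each test does with "setup reset".
    "$SPDK/scripts/setup.sh" reset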
00:04:01.517 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 75659348 kB' 'MemAvailable: 80100820 kB' 'Buffers: 14216 kB' 'Cached: 10567168 kB' 'SwapCached: 0 kB' 'Active: 6664388 kB' 'Inactive: 4410104 kB' 'Active(anon): 6095056 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 497108 kB' 'Mapped: 188356 kB' 'Shmem: 5601948 kB' 'KReclaimable: 230908 kB' 'Slab: 564088 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 333180 kB' 'KernelStack: 15920 kB' 'PageTables: 8208 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438196 kB' 'Committed_AS: 7396720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200696 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.517 23:46:01 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.517 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- 
setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 
23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.518 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:01.519 23:46:01 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:01.519 23:46:01 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:01.519 23:46:01 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:01.519 23:46:01 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:01.519 23:46:01 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:01.519 ************************************ 00:04:01.519 START TEST default_setup 00:04:01.519 ************************************ 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1121 -- # default_setup 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:01.519 23:46:01 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:04.848 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:04.848 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:05.107 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:05.107 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:05.366 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:07.910 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77792904 kB' 'MemAvailable: 82234376 kB' 'Buffers: 14216 kB' 'Cached: 10567288 kB' 'SwapCached: 0 kB' 'Active: 6684528 kB' 'Inactive: 4410104 kB' 'Active(anon): 6115196 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515912 kB' 'Mapped: 188984 kB' 'Shmem: 5602068 kB' 'KReclaimable: 230908 kB' 'Slab: 563128 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 332220 kB' 'KernelStack: 16288 kB' 'PageTables: 8804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7420048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.910 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
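[Note on the long parse loops in this block] The runs of "continue" lines here are setup/common.sh's get_meminfo helper stepping through /proc/meminfo one "key: value" pair at a time until it reaches the requested field (Hugepagesize earlier, AnonHugePages and HugePages_Surp in this stretch), then echoing the value. A simplified stand-alone sketch of that pattern, omitting the per-node meminfo handling and the mapfile'd in-memory copy visible in the trace:

    # Simplified reconstruction of the get_meminfo loop traced above; the real helper
    # also accepts a node argument and parses a pre-read copy of the file.
    get_meminfo() {
        local want=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$want" ]]; then
                echo "$val"        # e.g. Hugepagesize -> 2048, AnonHugePages -> 0
                return 0
            fi
        done < /proc/meminfo
        return 1
    }

    # Usage as in hugepages.sh: default page size in kB, then pages for a 2 GiB request.
    default_hugepages=$(get_meminfo Hugepagesize)   # 2048 on this node
    echo $(( 2097152 / default_hugepages ))         # -> 1024, matching nr_hugepages above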
00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.911 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.912 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77796156 kB' 'MemAvailable: 82237628 kB' 'Buffers: 14216 kB' 'Cached: 10567296 kB' 'SwapCached: 0 kB' 'Active: 6678520 kB' 'Inactive: 4410104 kB' 'Active(anon): 6109188 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510408 kB' 'Mapped: 188408 kB' 'Shmem: 5602076 kB' 'KReclaimable: 230908 kB' 'Slab: 563088 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 332180 kB' 'KernelStack: 16080 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7413952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200760 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.912 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 
23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:07.913 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77796784 kB' 'MemAvailable: 82238256 kB' 'Buffers: 14216 kB' 'Cached: 10567296 kB' 'SwapCached: 0 kB' 'Active: 6678208 kB' 'Inactive: 
4410104 kB' 'Active(anon): 6108876 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510096 kB' 'Mapped: 188392 kB' 'Shmem: 5602076 kB' 'KReclaimable: 230908 kB' 'Slab: 563088 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 332180 kB' 'KernelStack: 16080 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7413972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200728 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 
23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.914 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.915 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:07.916 nr_hugepages=1024 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:07.916 resv_hugepages=0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:07.916 surplus_hugepages=0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:07.916 anon_hugepages=0 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77799384 kB' 'MemAvailable: 82240856 kB' 'Buffers: 14216 kB' 'Cached: 10567336 kB' 'SwapCached: 0 kB' 'Active: 6679124 kB' 'Inactive: 4410104 kB' 'Active(anon): 6109792 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510944 kB' 'Mapped: 188400 kB' 'Shmem: 5602116 kB' 'KReclaimable: 230908 kB' 'Slab: 563056 kB' 'SReclaimable: 230908 kB' 
'SUnreclaim: 332148 kB' 'KernelStack: 16192 kB' 'PageTables: 9172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7414752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.916 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:07.917 23:46:08 
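
The long run of "continue" lines above is setup/common.sh's get_meminfo walking every row of /proc/meminfo (or a per-node meminfo file) until it reaches the requested field; here it has just matched HugePages_Total and echoed 1024. A minimal standalone sketch of that lookup follows. The helper name and argument handling are illustrative only, not the repository's own code.

  # Illustrative sketch (not SPDK's own helper): look up one field from
  # /proc/meminfo, or from a node's meminfo file when a node id is given.
  get_meminfo_field() {
    local field=$1 node=$2
    local mem_f=/proc/meminfo line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
      # per-node meminfo files prefix every row with "Node <id> "
      [[ -n $node ]] && line=${line#"Node $node "}
      IFS=': ' read -r var val _ <<< "$line"
      if [[ $var == "$field" ]]; then
        echo "$val"        # numeric value only; any "kB" unit lands in "_"
        return 0
      fi
    done < "$mem_f"
    return 1
  }
  # With the state captured above: get_meminfo_field HugePages_Total -> 1024

The one-field-per-iteration shape of the trace comes from xtrace printing each [[ ... ]] comparison; the sketch performs the same walk without the per-field noise.
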
setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:07.917 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35257564 kB' 'MemUsed: 12812324 kB' 'SwapCached: 0 kB' 'Active: 5697492 kB' 'Inactive: 4207904 kB' 'Active(anon): 5227816 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9636968 kB' 'Mapped: 157576 kB' 'AnonPages: 271528 kB' 'Shmem: 4959388 kB' 'KernelStack: 10408 kB' 'PageTables: 6424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157520 kB' 'Slab: 356872 kB' 'SReclaimable: 157520 kB' 'SUnreclaim: 199352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:07.918 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.178 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.179 
23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:08.179 node0=1024 expecting 1024 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:08.179 00:04:08.179 real 0m6.652s 00:04:08.179 user 0m1.658s 00:04:08.179 sys 0m2.729s 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:08.179 23:46:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:08.179 ************************************ 00:04:08.179 END TEST default_setup 00:04:08.179 ************************************ 00:04:08.179 23:46:08 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:08.179 23:46:08 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:08.179 23:46:08 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:08.179 23:46:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:08.179 ************************************ 00:04:08.179 START TEST per_node_1G_alloc 00:04:08.179 ************************************ 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1121 -- # per_node_1G_alloc 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- 
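
default_setup has just passed its check ("node0=1024 expecting 1024"), and per_node_1G_alloc, which starts above, funnels into the same verify_nr_hugepages flow. Reduced to its arithmetic, and reusing the sketch helper from earlier, the verification traced here is roughly the following; variable names are illustrative and the per-node step is simplified to a report of each node's total.

  # Rough shape of the verification traced above (illustrative, not the script):
  nr_hugepages=1024
  total=$(get_meminfo_field HugePages_Total)
  surp=$(get_meminfo_field HugePages_Surp)
  resv=$(get_meminfo_field HugePages_Rsvd)
  (( total == nr_hugepages + surp + resv )) || echo "unexpected total: $total"
  # Each NUMA node's share is then compared against its per-node meminfo:
  for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}
    echo "node$node=$(get_meminfo_field HugePages_Total "$node")"
  done

With the values in the node0 dump above (HugePages_Total 1024, HugePages_Surp 0, HugePages_Rsvd 0), the arithmetic check holds, which is what the "[[ 1024 == 1024 ]]" line records.
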
setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.179 23:46:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:12.378 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:12.378 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:12.378 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:12.378 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.378 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- 
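
per_node_1G_alloc sets NRHUGE=512 and HUGENODE=0,1 before invoking scripts/setup.sh, i.e. it asks for 512 hugepages on each of the two NUMA nodes (1024 in total, matching the nr_hugepages=1024 noted above). Assuming the default 2 MiB hugepage size shown in the meminfo dumps (Hugepagesize: 2048 kB), the standard kernel knobs behind such a request look roughly like this; this is a sketch of the intent, not scripts/setup.sh's actual implementation.

  # Sketch of a per-node 2 MiB hugepage reservation (not scripts/setup.sh itself):
  NRHUGE=512
  HUGENODE="0,1"
  for node in ${HUGENODE//,/ }; do
    echo "$NRHUGE" | sudo tee \
      "/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
  done
  # Confirm what the kernel actually granted on each node:
  grep -H . /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages
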
setup/hugepages.sh@89 -- # local node 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77789480 kB' 'MemAvailable: 82230952 kB' 'Buffers: 14216 kB' 'Cached: 10567428 kB' 'SwapCached: 0 kB' 'Active: 6679668 kB' 'Inactive: 4410104 kB' 'Active(anon): 6110336 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511464 kB' 'Mapped: 187360 kB' 'Shmem: 5602208 kB' 'KReclaimable: 230908 kB' 'Slab: 562476 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331568 kB' 'KernelStack: 16032 kB' 'PageTables: 8112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7439600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.378 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var 
val 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.379 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77791276 kB' 'MemAvailable: 82232748 kB' 'Buffers: 14216 kB' 'Cached: 10567432 kB' 'SwapCached: 0 kB' 'Active: 6678448 kB' 'Inactive: 4410104 kB' 'Active(anon): 6109116 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510252 kB' 'Mapped: 187364 kB' 'Shmem: 5602212 kB' 'KReclaimable: 230908 kB' 'Slab: 562444 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331536 kB' 'KernelStack: 16000 kB' 'PageTables: 7608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200760 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.380 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:12.381 23:46:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77788540 kB' 'MemAvailable: 82230012 kB' 'Buffers: 14216 kB' 'Cached: 10567448 kB' 'SwapCached: 0 kB' 'Active: 6678076 kB' 'Inactive: 4410104 kB' 'Active(anon): 6108744 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509844 kB' 'Mapped: 187356 kB' 'Shmem: 5602228 kB' 'KReclaimable: 230908 kB' 'Slab: 562380 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331472 kB' 'KernelStack: 15920 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200776 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.381 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.382 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 
23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:12.383 nr_hugepages=1024 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:12.383 resv_hugepages=0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:12.383 surplus_hugepages=0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 
00:04:12.383 anon_hugepages=0 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.383 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77787936 kB' 'MemAvailable: 82229408 kB' 'Buffers: 14216 kB' 'Cached: 10567480 kB' 'SwapCached: 0 kB' 'Active: 6679000 kB' 'Inactive: 4410104 kB' 'Active(anon): 6109668 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510756 kB' 'Mapped: 187356 kB' 'Shmem: 5602260 kB' 'KReclaimable: 230908 kB' 'Slab: 562380 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331472 kB' 'KernelStack: 15952 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:12.384 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _
[xtrace condensed: setup/common.sh@31-32 repeats the same IFS=': ' / read -r var val _ / key-compare / continue cycle for the remaining /proc/meminfo keys (Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted); none of them match HugePages_Total]
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
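The get_meminfo lookup traced above always follows the same pattern: pick /proc/meminfo or the per-NUMA-node file, strip the "Node <N> " prefix, then read "key: value" pairs until the requested field matches and echo its value. A minimal stand-alone sketch of that lookup follows; the name get_meminfo_sketch is hypothetical and this is an illustration only, not the actual setup/common.sh helper.

#!/usr/bin/env bash
# get_meminfo_sketch KEY [NODE] - hypothetical, simplified re-creation of the
# lookup traced above (illustration only, not the SPDK setup/common.sh code).
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo
    local -a mem
    local line var val _

    # When a node index is given, read the per-NUMA-node copy instead.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo

    # Per-node lines carry a "Node <N> " prefix; strip it so both file
    # layouts reduce to "Key:   value [kB]".
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"   # same split as in the trace
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo_sketch HugePages_Total      # 1024 on the box traced above
get_meminfo_sketch HugePages_Surp 0     # per-node surplus; 0 in this run

Scanning every key until the match is why the trace shows one IFS/read/continue triple per meminfo line; only the matching key reaches the echo at setup/common.sh@33.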
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:12.385 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36300148 kB' 'MemUsed: 11769740 kB' 'SwapCached: 0 kB' 'Active: 5697416 kB' 'Inactive: 4207904 kB' 'Active(anon): 5227740 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9636968 kB' 'Mapped: 156468 kB' 'AnonPages: 271592 kB' 'Shmem: 4959388 kB' 'KernelStack: 10312 kB' 'PageTables: 5640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157520 kB' 'Slab: 356508 kB' 'SReclaimable: 157520 kB' 'SUnreclaim: 198988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: setup/common.sh@31-32 walks the node0 keys above (MemTotal through HugePages_Free) with the same IFS=': ' / read / continue cycle; only HugePages_Surp matches]
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:12.387 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41486868 kB' 'MemUsed: 2736732 kB' 'SwapCached: 0 kB' 'Active: 981248 kB' 'Inactive: 202200 kB' 'Active(anon): 881592 kB' 'Inactive(anon): 0 kB' 'Active(file): 99656 kB' 'Inactive(file): 202200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 944752 kB' 'Mapped: 30824 kB' 'AnonPages: 238924 kB' 'Shmem: 642896 kB' 'KernelStack: 5816 kB' 'PageTables: 2476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 73388 kB' 'Slab: 205872 kB' 'SReclaimable: 73388 kB' 'SUnreclaim: 132484 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
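The node0 lookup above, and the node1 lookup now in progress, feed the same bookkeeping in setup/hugepages.sh@115-117: each node's count starts from the reserved pages, the node's HugePages_Surp value is added, and the totals are then echoed and compared against the expected per-node split. A small self-contained sketch of that accumulation follows; the helper name node_surplus and the hard-coded 512/512 split are assumptions for illustration, not the SPDK scripts themselves.

#!/usr/bin/env bash
# Hypothetical sketch of the per-node accounting traced above; node_surplus()
# stands in for "get_meminfo HugePages_Surp <node>".
nodes_test=(512 512)   # expected per-node split of the 1024 small hugepages
resv=0                 # reserved pages, added once per node

node_surplus() {
    # "Node <N> HugePages_Surp:   <count>" in the per-node meminfo file
    local f=/sys/devices/system/node/node$1/meminfo
    [[ -e $f ]] && awk '/HugePages_Surp/ {print $NF; exit}' "$f" || echo 0
}

for node in "${!nodes_test[@]}"; do
    surp=$(node_surplus "$node")
    (( nodes_test[node] += resv + ${surp:-0} ))
    echo "node${node}=${nodes_test[node]} expecting 512"
done

On this machine both nodes report HugePages_Surp: 0, so the totals stay at 512 per node and the [[ 512 == 512 ]] check further down passes.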
[xtrace condensed: setup/common.sh@31-32 walks the node1 keys above with the same IFS=': ' / read / continue cycle; only HugePages_Surp matches]
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:12.388
00:04:12.388 real 0m4.002s
00:04:12.388 user 0m1.560s
00:04:12.388 sys 0m2.545s
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable
00:04:12.388 23:46:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:12.388 ************************************
00:04:12.388 END TEST per_node_1G_alloc
00:04:12.388 ************************************
00:04:12.388 23:46:12 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:12.388 23:46:12 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:04:12.388 23:46:12 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable
00:04:12.388 23:46:12 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:12.388 ************************************
00:04:12.388 START TEST even_2G_alloc
00:04:12.388 ************************************ 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1121 -- # even_2G_alloc 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.388 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:12.389 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:12.389 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:12.389 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.389 23:46:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:15.677 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:15.677 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:15.936 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:15.936 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 
00:04:15.936 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:15.936 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.936 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.937 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77764708 kB' 'MemAvailable: 82206180 kB' 'Buffers: 14216 kB' 'Cached: 10567588 kB' 'SwapCached: 0 kB' 'Active: 6680680 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111348 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511716 
kB' 'Mapped: 187444 kB' 'Shmem: 5602368 kB' 'KReclaimable: 230908 kB' 'Slab: 562368 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331460 kB' 'KernelStack: 16096 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB'
[xtrace condensed: setup/common.sh@31-32 applies the same IFS=': ' / read / continue cycle to the system-wide meminfo keys above (MemTotal through WritebackTmp) while get_meminfo scans for AnonHugePages]
00:04:15.938 23:46:16
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # 
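The scan above is the get_meminfo helper from setup/common.sh looking up AnonHugePages. A minimal sketch of what that helper appears to do, reconstructed only from the trace entries visible in this log (the variable names get, node, var, val, mem_f and mem come from the trace; the loop shape, the herestring and the per-node path are assumptions, not the verbatim SPDK source):

    # Sketch, not the actual setup/common.sh source: look up one field of /proc/meminfo
    # (or of a per-node meminfo file when a NUMA node is given) and print its value.
    shopt -s extglob
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f mem
        mem_f=/proc/meminfo
        # Assumed per-node path: with an empty node this test is false, as in the trace.
        [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")    # strip the "Node N " prefix of per-node files
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue    # the repeated check seen in the scan above
            echo "$val"                         # e.g. AnonHugePages -> 0 on this host
            return 0
        done
    }

In this run the helper prints 0 for AnonHugePages, which hugepages.sh stores as anon=0 before moving on to the surplus and reserved counters below.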
00:04:15.938 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.204 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77766184 kB' 'MemAvailable: 82207656 kB' 'Buffers: 14216 kB' 'Cached: 10567592 kB' 'SwapCached: 0 kB' 'Active: 6679872 kB' 'Inactive: 4410104 kB' 'Active(anon): 6110540 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511396 kB' 'Mapped: 187368 kB' 'Shmem: 5602372 kB' 'KReclaimable: 230908 kB' 'Slab: 562360 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331452 kB' 'KernelStack: 16032 kB' 'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409612 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB'
[... repeated scan entries condensed: the trace steps through every field of the snapshot above (IFS=': ', read -r var val _, [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]], continue) until it reaches HugePages_Surp ...]
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
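The [[ -e /sys/devices/system/node/node/meminfo ]] and [[ -n '' ]] checks above are the no-node case: no NUMA node was passed, so the helper falls back to /proc/meminfo. When a node is given, per-node meminfo lines carry a "Node <n> " prefix, which is what the mem=("${mem[@]#Node +([0-9]) }") expansion strips before parsing. A small stand-alone illustration of that stripping; the sample lines and the 512-page values are made up, only the expansion itself is taken from the trace:

    # Illustration of the "Node N " prefix strip seen at setup/common.sh@29.
    shopt -s extglob                               # +([0-9]) is an extglob pattern
    mem=('Node 0 HugePages_Total:  512' 'Node 0 HugePages_Free:  512')
    mem=("${mem[@]#Node +([0-9]) }")               # -> 'HugePages_Total:  512' ...
    printf '%s\n' "${mem[@]}"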
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:16.206 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77763732 kB' 'MemAvailable: 82205204 kB' 'Buffers: 14216 kB' 'Cached: 10567608 kB' 'SwapCached: 0 kB' 'Active: 6679248 kB' 'Inactive: 4410104 kB' 'Active(anon): 6109916 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510736 kB' 'Mapped: 187368 kB' 'Shmem: 5602388 kB' 'KReclaimable: 230908 kB' 'Slab: 562360 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331452 kB' 'KernelStack: 16016 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB'
[... repeated scan entries condensed: the trace steps through every field of the snapshot above (IFS=': ', read -r var val _, [[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]], continue) until it reaches HugePages_Rsvd ...]
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:16.208 nr_hugepages=1024
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:16.208 resv_hugepages=0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:16.208 surplus_hugepages=0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:16.208 anon_hugepages=0
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
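Taken together, the three lookups above give anon=0, surp=0 and resv=0; the test then echoes the expected counters (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0), asserts (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )), and re-reads HugePages_Total for the comparison that starts below. A condensed sketch of that accounting, assuming the get_meminfo sketch shown earlier; the helper name and the assertions mirror the trace, everything else is an assumption:

    # Sketch of the even_2G_alloc accounting seen around setup/hugepages.sh@97-@110.
    nr_hugepages=1024                        # 1024 x 2048 kB pages, per the snapshot above
    anon=$(get_meminfo AnonHugePages)        # 0 in this run
    surp=$(get_meminfo HugePages_Surp)       # 0
    resv=$(get_meminfo HugePages_Rsvd)       # 0
    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    echo "anon_hugepages=$anon"
    (( 1024 == nr_hugepages + surp + resv )) # no surplus or reserved pages expected
    (( 1024 == nr_hugepages ))               # count matches the requested allocation
    total=$(get_meminfo HugePages_Total)     # kernel-reported total, checked next in the log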
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.208 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77764216 kB' 'MemAvailable: 82205688 kB' 'Buffers: 14216 kB' 'Cached: 10567632 kB' 'SwapCached: 0 kB' 'Active: 6679460 kB' 'Inactive: 4410104 kB' 'Active(anon): 6110128 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510948 kB' 'Mapped: 187368 kB' 'Shmem: 5602412 kB' 'KReclaimable: 230908 kB' 'Slab: 562520 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331612 kB' 'KernelStack: 16064 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB'
00:04:16.208 [setup/common.sh@31-32: each field of the dump above read and skipped until HugePages_Total matches]
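The snapshot above already shows a consistent 2 GiB hugepage pool before the per-node checks run: HugePages_Total is 1024, Hugepagesize is 2048 kB, and Hugetlb is 2097152 kB, which is exactly 1024 x 2048 kB. A quick hedged check with values copied from the dump (the variable names are ours):

hugepages_total=1024   # HugePages_Total from the dump
hugepagesize_kb=2048   # Hugepagesize
hugetlb_kb=2097152     # Hugetlb

(( hugepages_total * hugepagesize_kb == hugetlb_kb )) &&
	echo "pool consistent: $(( hugetlb_kb / 1024 )) MiB in $hugepages_total pages"
# -> pool consistent: 2048 MiB in 1024 pages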
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.210 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36298668 kB' 'MemUsed: 11771220 kB' 'SwapCached: 0 kB' 'Active: 5698036 kB' 'Inactive: 4207904 kB' 'Active(anon): 5228360 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9636968 kB' 'Mapped: 156544 kB' 'AnonPages: 272136 kB' 'Shmem: 4959388 kB' 'KernelStack: 10312 kB' 'PageTables: 6128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157520 kB' 'Slab: 356572 kB' 'SReclaimable: 157520 kB' 'SUnreclaim: 199052 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:16.211 [setup/common.sh@31-32: node0 fields read and skipped until HugePages_Surp matches]
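get_nodes above discovers the NUMA topology by globbing /sys/devices/system/node/node+([0-9]) and, for this even allocation, expects 1024 / 2 = 512 pages on each of the two nodes; each node is then re-checked through its own meminfo file. A hedged sketch of that discovery step (array and variable names are ours; the glob and the arithmetic follow the trace):

shopt -s extglob nullglob
declare -a nodes_sys_sketch

nr_hugepages=1024
nodes=(/sys/devices/system/node/node+([0-9]))   # node0 node1 on this host
no_nodes=${#nodes[@]}
(( no_nodes > 0 )) || exit 1                    # same guard the script traces above

for node in "${nodes[@]}"; do
	# ${node##*node} keeps only the numeric suffix: 0, 1, ...
	nodes_sys_sketch[${node##*node}]=$(( nr_hugepages / no_nodes ))
done

declare -p nodes_sys_sketch
# -> declare -a nodes_sys_sketch=([0]="512" [1]="512")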
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.212 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41472656 kB' 'MemUsed: 2750944 kB' 'SwapCached: 0 kB' 'Active: 982272 kB' 'Inactive: 202200 kB' 'Active(anon): 882616 kB' 'Inactive(anon): 0 kB' 'Active(file): 99656 kB' 'Inactive(file): 202200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 944900 kB' 'Mapped: 30824 kB' 'AnonPages: 239684 kB' 'Shmem: 643044 kB' 'KernelStack: 5832 kB' 'PageTables: 2540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 73388 kB' 'Slab: 205948 kB' 'SReclaimable: 73388 kB' 'SUnreclaim: 132560 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:16.212 [setup/common.sh@31-32: node1 fields read and skipped until HugePages_Surp matches]
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:16.213 node0=512 expecting 512
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:16.213 node1=512 expecting 512
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:16.213
00:04:16.213 real 0m4.004s
00:04:16.213 user 0m1.538s
00:04:16.213 sys 0m2.564s
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable
00:04:16.213 23:46:16 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:16.213 ************************************
00:04:16.213 END TEST even_2G_alloc
00:04:16.213 ************************************
00:04:16.213 23:46:16 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:16.213 23:46:16 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:04:16.213 23:46:16 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable
00:04:16.213 23:46:16 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:16.472 ************************************
00:04:16.472 START TEST odd_alloc
00:04:16.472 ************************************
00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1121 -- # odd_alloc
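even_2G_alloc above passes because each node reports HugePages_Total: 512 with no surplus, matching the even split of the 1024-page pool. For reference, the same per-node counters are exposed directly in sysfs (the standard kernel layout for 2048 kB pages, not something specific to this script), which is a quick hedged way to spot-check either this run or the odd_alloc run that follows:

expected_per_node=512
ok=1
for node in /sys/devices/system/node/node[0-9]*; do
	got=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
	echo "${node##*/}=$got expecting $expected_per_node"
	(( got == expected_per_node )) || ok=0
done
(( ok )) && echo "even 2G allocation verified"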
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:16.472 23:46:16 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:19.764 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:19.765 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:19.765 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:19.765 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 
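
The odd_alloc preamble above turns the 2098176 kB request into nr_hugepages=1025 (an intentionally odd count for this test) and spreads it across the two NUMA nodes as 513 pages on node0 and 512 on node1, before setup.sh is rerun with HUGEMEM=2049 and HUGE_EVEN_ALLOC=yes. A minimal stand-alone sketch of that split; the helper name and plain echo output are illustrative, not the traced setup/hugepages.sh:

  #!/usr/bin/env bash
  # Sketch: divide an odd hugepage count across NUMA nodes the way the trace
  # above implies -- remainder pages land on the lower-numbered nodes.
  split_hugepages() {
    local total=$1 nodes=$2 node base rem
    base=$((total / nodes))   # 1025 / 2 -> 512
    rem=$((total % nodes))    # 1025 % 2 -> 1
    for ((node = 0; node < nodes; node++)); do
      # nodes 0..rem-1 receive one extra page: node0=513, node1=512
      echo "node${node}=$((base + (node < rem ? 1 : 0)))"
    done
  }
  split_hugepages 1025 2
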
00:04:19.765 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:19.765 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77758944 kB' 'MemAvailable: 82200416 kB' 'Buffers: 14216 kB' 'Cached: 10567744 kB' 'SwapCached: 0 kB' 'Active: 6681200 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111868 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512684 kB' 'Mapped: 187368 kB' 'Shmem: 5602524 kB' 'KReclaimable: 230908 kB' 'Slab: 562192 kB' 'SReclaimable: 230908 kB' 'SUnreclaim: 331284 kB' 'KernelStack: 15856 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7407108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200712 kB' 
'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:19.765 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ [ the same IFS=': ' / read -r var val _ / [[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue scan repeats for Active(anon) through Committed_AS ] 00:04:20.030 23:46:20
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.030 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.030 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.030 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.030 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 
23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77759320 kB' 'MemAvailable: 82200776 kB' 'Buffers: 14216 kB' 'Cached: 10567748 kB' 'SwapCached: 0 kB' 'Active: 6681396 kB' 'Inactive: 4410104 kB' 'Active(anon): 6112064 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512976 kB' 'Mapped: 187368 kB' 'Shmem: 5602528 kB' 'KReclaimable: 230876 kB' 'Slab: 562276 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331400 kB' 'KernelStack: 15904 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7407128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200680 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
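
Every get_meminfo call in this verify step is the same field lookup: /proc/meminfo (or a node's meminfo file, when a node number is passed) is read line by line with IFS=': ', every key that does not match the requested one is skipped with continue, and the value of the matching key is echoed -- here the scans for AnonHugePages, HugePages_Surp and HugePages_Rsvd against the 1025-page snapshot above. A condensed re-implementation for illustration only, not the setup/common.sh function being traced (the per-node "Node <N> " prefix handling is omitted):

  #!/usr/bin/env bash
  # Sketch: fetch a single field from a meminfo-style file by scanning
  # key/value pairs, the pattern the trace above spells out key by key.
  get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
      [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done < "$mem_f"
    echo 0
  }
  get_meminfo_sketch HugePages_Surp    # -> 0 in the snapshot above
  get_meminfo_sketch HugePages_Total   # -> 1025 in the snapshot above
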
00:04:20.031 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.031 [ the same IFS=': ' / read -r var val _ / [[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue scan repeats for Active through HugePages_Free ] 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- #
read -r var val _ 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:20.032 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77759624 kB' 'MemAvailable: 82201080 kB' 'Buffers: 14216 kB' 'Cached: 10567752 kB' 'SwapCached: 0 kB' 'Active: 6681020 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111688 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512596 kB' 'Mapped: 187368 kB' 'Shmem: 5602532 kB' 'KReclaimable: 230876 kB' 'Slab: 562276 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331400 kB' 'KernelStack: 15904 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7407148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200680 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:20.033 [ the same IFS=': ' / read -r var val _ / [[ <key> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue scan repeats for MemFree through SUnreclaim ] 00:04:20.033 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
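
Once the anon, surp and resv values have been collected this way, the verification reduces to comparing each node's actual hugepage count against the expected split, the same style of check that produced the 'node0=512 expecting 512' lines for even_2G_alloc earlier in this log. A small sketch of one way to read those per-node counts from the standard kernel sysfs layout (2048 kB is the Hugepagesize reported in the snapshots above); the loop is illustrative, not the traced hugepages.sh:

  #!/usr/bin/env bash
  # Sketch: print how many 2048 kB hugepages each NUMA node currently holds,
  # the quantity the "nodeN=... expecting ..." checks compare against.
  for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*/node}
    nr=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
    echo "node${node}=${nr}"
  done
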
00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:20.034 nr_hugepages=1025 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.034 resv_hugepages=0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.034 surplus_hugepages=0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.034 anon_hugepages=0 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.034 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77759624 kB' 'MemAvailable: 82201080 kB' 'Buffers: 14216 kB' 'Cached: 10567784 kB' 'SwapCached: 0 kB' 'Active: 6681284 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111952 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512800 kB' 'Mapped: 187368 kB' 'Shmem: 5602564 kB' 'KReclaimable: 230876 kB' 'Slab: 562276 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331400 kB' 'KernelStack: 15904 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7407168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200696 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 
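The loop traced above is setup/common.sh's get_meminfo walking every "key: value" pair of a meminfo file until it reaches the requested field (HugePages_Total here): with no node argument it falls back to /proc/meminfo, otherwise it would read the node's own meminfo and strip the "Node <n> " prefix first. A minimal, simplified sketch of that lookup (not the actual helper; it only assumes the standard /proc and /sys paths shown in the trace):

#!/usr/bin/env bash
# Simplified stand-in for the traced lookup: pick /proc/meminfo or a node's
# meminfo, drop the "Node <n> " prefix, then scan "key: value" pairs.
get_meminfo_sketch() {
    local key=$1 node=${2:-}
    local mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    while IFS=': ' read -r var val _; do
        # same comparison the trace repeats for every field
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}
# e.g. get_meminfo_sketch HugePages_Total   -> 1025 on this run
#      get_meminfo_sketch HugePages_Surp 0  -> 0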
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.035 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36302404 kB' 'MemUsed: 11767484 kB' 'SwapCached: 0 kB' 'Active: 5696544 kB' 'Inactive: 4207904 kB' 'Active(anon): 5226868 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9636968 kB' 'Mapped: 156560 kB' 'AnonPages: 270708 kB' 'Shmem: 4959388 kB' 'KernelStack: 10056 kB' 'PageTables: 5292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157488 kB' 'Slab: 356348 kB' 'SReclaimable: 157488 kB' 'SUnreclaim: 198860 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.036 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- 
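At this point the odd_alloc case has asked for 1025 hugepages and is reading each NUMA node's share back; the node0 dump above reports HugePages_Total: 512, and node1 (further down) reports 513. A rough way to reproduce the same check outside the test scripts (assumes root, 2 MiB default hugepages, and a two-node machine like this one):

# Reproduce the odd_alloc split by hand (assumptions: root, 2 MiB hugepages,
# two NUMA nodes; values in the comments are from this particular run).
sudo sh -c 'echo 1025 > /proc/sys/vm/nr_hugepages'
# The kernel spreads the odd count across the nodes; here it came out 512/513.
grep -h HugePages_Total /sys/devices/system/node/node*/meminfo
# Node 0 HugePages_Total:   512
# Node 1 HugePages_Total:   513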
setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.037 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41457916 kB' 'MemUsed: 2765684 kB' 'SwapCached: 0 kB' 'Active: 984984 kB' 'Inactive: 202200 kB' 'Active(anon): 885328 kB' 'Inactive(anon): 0 kB' 'Active(file): 99656 kB' 'Inactive(file): 202200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 945076 kB' 'Mapped: 30824 kB' 'AnonPages: 242268 kB' 'Shmem: 643220 kB' 'KernelStack: 5800 kB' 'PageTables: 2492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 73388 kB' 'Slab: 205928 kB' 'SReclaimable: 73388 kB' 'SUnreclaim: 132540 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.038 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
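The long run of "-- setup/common.sh@32 -- # continue" entries around this point is the shell stepping over every "key: value" pair in /proc/meminfo until it reaches the field it was asked for, here HugePages_Surp. A minimal stand-alone sketch of that scan, covering the system-wide case only (the name get_meminfo_field is made up for this illustration rather than taken from setup/common.sh):

    #!/usr/bin/env bash
    # Illustrative sketch of the meminfo field scan traced in this log.
    get_meminfo_field() {
        local get=$1                     # e.g. HugePages_Surp, AnonHugePages
        local var val _
        while IFS=': ' read -r var val _; do
            # Skip every field until the requested one; each skipped field is
            # one "[[ X == HugePages_Surp ]] ... continue" pair in the trace.
            [[ $var == "$get" ]] || continue
            echo "$val"                  # value in kB, or a bare page count
            return 0
        done < /proc/meminfo
        echo 0                           # field not present
    }

    get_meminfo_field HugePages_Surp     # prints 0 when no surplus pages exist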
00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:20.039 node0=512 expecting 513 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:20.039 node1=513 expecting 512 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:20.039 00:04:20.039 real 0m3.738s 00:04:20.039 user 0m1.369s 00:04:20.039 sys 0m2.449s 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:20.039 23:46:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:20.039 ************************************ 00:04:20.039 END TEST odd_alloc 00:04:20.039 ************************************ 00:04:20.039 23:46:20 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:20.039 23:46:20 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:20.039 23:46:20 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:20.039 23:46:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:20.299 ************************************ 00:04:20.299 START TEST custom_alloc 00:04:20.299 ************************************ 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1121 -- # custom_alloc 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 
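Here custom_alloc is converting the two requested sizes into 2048 kB hugepage counts and spreading them across the two NUMA nodes: 1048576 kB becomes 512 pages and 2097152 kB becomes 1024 pages, and the even 256-per-node split of the first request is later overridden by the per-node targets nodes_hp[0]=512 and nodes_hp[1]=1024. A small stand-alone sketch of that arithmetic (names are illustrative, not taken from setup/hugepages.sh; the 2048 kB page size matches the "Hugepagesize: 2048 kB" line printed further down):

    #!/usr/bin/env bash
    # Worked version of the sizing arithmetic traced around this point.
    default_hugepage_kb=2048              # from "Hugepagesize: 2048 kB"

    pages_for_size() {                    # requested size in kB -> page count
        local size_kb=$1
        echo $(( size_kb / default_hugepage_kb ))
    }

    split_across_nodes() {                # even split before any override
        local pages=$1 nodes=$2
        echo $(( pages / nodes ))
    }

    pages_for_size 1048576                # -> 512   (nodes_hp[0]=512)
    pages_for_size 2097152                # -> 1024  (nodes_hp[1]=1024)
    split_across_nodes 512 2              # -> 256 per node initially
    # Total 512 + 1024 = 1536 pages, matching nr_hugepages=1536 and the
    # HugePages_Total: 1536 figure reported by /proc/meminfo later in the log,
    # handed to scripts/setup.sh as HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'.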
00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for 
node in "${!nodes_hp[@]}" 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:20.299 23:46:20 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:23.596 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:23.596 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:23.913 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:23.913 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.6 (8086 2021): 
Already using the vfio-pci driver 00:04:23.913 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:23.913 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76690292 kB' 'MemAvailable: 81131748 kB' 'Buffers: 14216 kB' 'Cached: 10567892 kB' 'SwapCached: 0 kB' 'Active: 6681608 kB' 'Inactive: 4410104 kB' 'Active(anon): 6112276 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512456 kB' 'Mapped: 187460 kB' 'Shmem: 5602672 kB' 'KReclaimable: 230876 kB' 'Slab: 562340 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331464 kB' 'KernelStack: 15904 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7408072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200744 kB' 'VmallocChunk: 0 kB' 
'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.913 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.914 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # 
[[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76691024 kB' 'MemAvailable: 81132480 kB' 'Buffers: 14216 kB' 'Cached: 10567896 kB' 'SwapCached: 0 kB' 'Active: 6680752 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111420 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512140 kB' 'Mapped: 187380 kB' 'Shmem: 5602676 kB' 'KReclaimable: 230876 kB' 'Slab: 562332 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331456 kB' 'KernelStack: 15904 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7408092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200696 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.915 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.916 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76693544 kB' 'MemAvailable: 81135000 kB' 'Buffers: 14216 kB' 'Cached: 10567912 kB' 'SwapCached: 0 kB' 'Active: 6680780 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111448 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512144 kB' 'Mapped: 187380 kB' 'Shmem: 5602692 kB' 'KReclaimable: 230876 kB' 'Slab: 562332 kB' 'SReclaimable: 
230876 kB' 'SUnreclaim: 331456 kB' 'KernelStack: 15920 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7408128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200696 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.917 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.918 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 
23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:24.182 nr_hugepages=1536 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.182 resv_hugepages=0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.182 surplus_hugepages=0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.182 anon_hugepages=0 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76695668 kB' 'MemAvailable: 81137124 kB' 'Buffers: 14216 kB' 'Cached: 10567936 kB' 'SwapCached: 0 kB' 'Active: 6680764 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111432 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512088 kB' 'Mapped: 187380 kB' 'Shmem: 5602716 kB' 'KReclaimable: 230876 kB' 'Slab: 562364 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331488 kB' 'KernelStack: 15904 kB' 'PageTables: 7920 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7410328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200680 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 
kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
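The loop traced above and below is get_meminfo from setup/common.sh: it reads /proc/meminfo (or, for a per-node query, /sys/devices/system/node/nodeN/meminfo), strips any leading "Node N " prefix, then walks key/value pairs with IFS=': ' until the requested field (HugePages_Surp, HugePages_Rsvd, HugePages_Total, ...) matches and its value is echoed. A minimal sketch of that parsing, written against the field layout visible in this log rather than the verbatim setup/common.sh implementation (which uses mapfile and extglob stripping):

    # Sketch only: same idea as the traced helper, not the exact script.
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # A per-node query prefers that node's own meminfo file when present.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local line var val _
        while read -r line; do
            line=${line#Node "$node" }      # per-node lines start with "Node N "
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "${val:-0}"
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    # Mirrors the consistency check at setup/hugepages.sh@107 in this run:
    # the configured total (1536) must equal requested + surplus + reserved.
    surp=$(get_meminfo HugePages_Surp)
    resv=$(get_meminfo HugePages_Rsvd)
    total=$(get_meminfo HugePages_Total)
    (( total == 1536 + surp + resv )) && echo "hugepage accounting consistent"
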
00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.182 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.183 
23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36307596 kB' 'MemUsed: 11762292 kB' 'SwapCached: 0 kB' 'Active: 5698596 kB' 'Inactive: 4207904 kB' 'Active(anon): 5228920 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9636992 kB' 'Mapped: 156572 kB' 'AnonPages: 272756 kB' 'Shmem: 4959412 kB' 'KernelStack: 10056 kB' 'PageTables: 5344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157488 kB' 'Slab: 356180 kB' 'SReclaimable: 157488 kB' 'SUnreclaim: 198692 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.183 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 40384164 kB' 'MemUsed: 3839436 kB' 'SwapCached: 0 kB' 'Active: 987856 kB' 'Inactive: 202200 kB' 'Active(anon): 888200 kB' 'Inactive(anon): 0 kB' 'Active(file): 99656 kB' 'Inactive(file): 202200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 945200 kB' 'Mapped: 31312 kB' 'AnonPages: 245008 kB' 'Shmem: 643344 kB' 'KernelStack: 5816 kB' 'PageTables: 2484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 73388 kB' 'Slab: 206184 kB' 'SReclaimable: 73388 kB' 'SUnreclaim: 132796 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.184 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.185 23:46:24 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:24.185 node0=512 expecting 512 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:24.185 node1=1024 expecting 1024 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:24.185 00:04:24.185 real 0m3.973s 00:04:24.185 user 0m1.552s 00:04:24.185 sys 0m2.517s 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:24.185 23:46:24 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:24.185 ************************************ 00:04:24.185 END TEST custom_alloc 00:04:24.185 ************************************ 00:04:24.185 23:46:24 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:24.185 23:46:24 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:24.185 23:46:24 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:24.185 23:46:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:24.185 ************************************ 00:04:24.185 START TEST no_shrink_alloc 00:04:24.185 ************************************ 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1121 -- # no_shrink_alloc 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:24.185 
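The custom_alloc trace above finishes by echoing the observed per-node hugepage counts ("node0=512 expecting 512", "node1=1024 expecting 1024") and comparing them as one comma-joined string ([[ 512,1024 == 512,1024 ]]). As a rough illustration only, and not the actual setup/hugepages.sh logic, that final check amounts to something like the snippet below (the array names are placeholders):

    # Sketch (assumed, not the real script): shape of the per-node comparison
    # printed by the custom_alloc trace above.
    nodes_test=(512 1024)       # hugepages observed per NUMA node (via the get_meminfo walk above)
    nodes_expected=(512 1024)   # hugepages the test requested on node0 and node1
    for node in "${!nodes_test[@]}"; do
        echo "node$node=${nodes_test[node]} expecting ${nodes_expected[node]}"
    done
    got=$(IFS=,; echo "${nodes_test[*]}")        # "512,1024"
    want=$(IFS=,; echo "${nodes_expected[*]}")   # "512,1024"
    [[ $got == "$want" ]]                        # mirrors the [[ 512,1024 == 512,1024 ]] check in the log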
23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.185 23:46:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:28.381 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:28.381 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:28.381 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:28.381 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:28.381 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
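The xtrace from here on is verify_nr_hugepages calling get_meminfo: it opens /proc/meminfo (or /sys/devices/system/node/nodeN/meminfo when a node id is given), strips the "Node N " prefix from per-node lines, and walks the "Key: value" pairs until the requested key matches (AnonHugePages here, HugePages_Surp earlier and later), echoing just the numeric value. The helper below is a rough sketch of that lookup reconstructed from the trace, not the actual setup/common.sh implementation; the function name is invented.

    # Sketch (assumed) of the meminfo lookup exercised by the surrounding trace.
    get_meminfo_sketch() {
        local key=$1 node=${2:-} mem_f=/proc/meminfo line
        # With a node id, prefer the per-node meminfo file when it exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS= read -r line; do
            line=${line#"Node $node "}             # per-node lines start with "Node <id> "
            if [[ $line == "$key:"* ]]; then
                line=${line#"$key:"}               # e.g. "          0" or "   40384164 kB"
                line=${line#"${line%%[0-9]*}"}     # drop the leading whitespace
                echo "${line%% *}"                 # print just the number, without the kB unit
                return 0
            fi
        done <"$mem_f"
        return 1                                   # requested key not present
    }
    # Example calls mirroring the trace:
    #   get_meminfo_sketch HugePages_Surp 1   -> surplus hugepages on node1
    #   get_meminfo_sketch AnonHugePages      -> system-wide value from /proc/meminfo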
00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77664272 kB' 'MemAvailable: 82105728 kB' 'Buffers: 14216 kB' 'Cached: 10568048 kB' 'SwapCached: 0 kB' 'Active: 6681112 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111780 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512316 kB' 'Mapped: 187468 kB' 'Shmem: 5602828 kB' 'KReclaimable: 230876 kB' 'Slab: 562448 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331572 kB' 'KernelStack: 15888 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408924 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200808 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.381 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.382 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77664976 kB' 'MemAvailable: 82106432 kB' 'Buffers: 14216 kB' 'Cached: 10568052 kB' 'SwapCached: 0 kB' 'Active: 6681360 kB' 'Inactive: 4410104 kB' 'Active(anon): 6112028 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512556 kB' 'Mapped: 187388 kB' 'Shmem: 5602832 kB' 'KReclaimable: 230876 kB' 'Slab: 562436 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331560 kB' 'KernelStack: 15904 kB' 
'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200776 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.383 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:28.384 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77665272 kB' 'MemAvailable: 82106728 kB' 'Buffers: 14216 kB' 'Cached: 10568052 kB' 'SwapCached: 0 kB' 'Active: 6681048 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111716 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512236 kB' 'Mapped: 187388 kB' 'Shmem: 5602832 kB' 'KReclaimable: 230876 kB' 'Slab: 562436 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331560 kB' 'KernelStack: 15904 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7408964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200776 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 
0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.385 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 
23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.386 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:28.387 nr_hugepages=1024 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:28.387 resv_hugepages=0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:28.387 surplus_hugepages=0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:28.387 anon_hugepages=0 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77666960 kB' 'MemAvailable: 82108416 kB' 'Buffers: 14216 kB' 'Cached: 10568056 kB' 'SwapCached: 0 kB' 'Active: 6681172 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111840 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512356 kB' 'Mapped: 187388 kB' 'Shmem: 5602836 kB' 'KReclaimable: 230876 kB' 'Slab: 562436 kB' 'SReclaimable: 230876 kB' 'SUnreclaim: 331560 kB' 'KernelStack: 15872 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7410012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200728 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 
kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.387 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:28.388 
23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:28.388 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35265936 kB' 'MemUsed: 12803952 kB' 'SwapCached: 0 kB' 'Active: 5700084 kB' 'Inactive: 4207904 kB' 'Active(anon): 5230408 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9637024 kB' 'Mapped: 156580 kB' 'AnonPages: 274244 kB' 'Shmem: 4959444 kB' 'KernelStack: 10136 kB' 'PageTables: 5700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157488 kB' 'Slab: 356076 kB' 'SReclaimable: 157488 kB' 'SUnreclaim: 198588 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.389 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 
23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:28.390 node0=1024 expecting 1024 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.390 23:46:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:31.678 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:31.678 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:31.678 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:31.678 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:31.678 0000:80:04.2 (8086 2021): 
Already using the vfio-pci driver 00:04:31.943 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:31.943 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:31.943 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77689384 kB' 'MemAvailable: 82130792 kB' 'Buffers: 14216 kB' 'Cached: 10568176 kB' 'SwapCached: 0 kB' 'Active: 6681584 kB' 'Inactive: 4410104 kB' 'Active(anon): 6112252 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512100 kB' 'Mapped: 187508 kB' 'Shmem: 5602956 kB' 'KReclaimable: 230780 kB' 'Slab: 562388 kB' 'SReclaimable: 230780 kB' 'SUnreclaim: 331608 kB' 'KernelStack: 15920 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200728 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 
23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.943 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@19 -- # local var val 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77688932 kB' 'MemAvailable: 82130340 kB' 'Buffers: 14216 kB' 'Cached: 10568180 kB' 'SwapCached: 0 kB' 'Active: 6681308 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111976 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512296 kB' 'Mapped: 187396 kB' 'Shmem: 5602960 kB' 'KReclaimable: 230780 kB' 'Slab: 562392 kB' 'SReclaimable: 230780 kB' 'SUnreclaim: 331612 kB' 'KernelStack: 15904 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200712 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.944 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 
23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.945 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77689512 kB' 'MemAvailable: 82130920 kB' 'Buffers: 14216 kB' 'Cached: 10568184 kB' 'SwapCached: 0 kB' 'Active: 6681020 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111688 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512000 kB' 'Mapped: 187396 kB' 'Shmem: 5602964 kB' 'KReclaimable: 230780 kB' 'Slab: 562392 kB' 'SReclaimable: 230780 kB' 'SUnreclaim: 331612 kB' 'KernelStack: 15904 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200712 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
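The loop stepping field by field through the meminfo snapshot printed above is the get_meminfo helper from setup/common.sh. A minimal sketch of what this trace is executing, paraphrased from the trace itself rather than copied from the script (function and variable names follow the trace; the mapfile/redirect wiring is an assumption):

    shopt -s extglob                 # assumption: needed for the +([0-9]) pattern below
    get_meminfo() {
        local get=$1 node=$2         # field name, optional NUMA node number
        local var val _
        local mem_f mem
        mem_f=/proc/meminfo
        # A per-node lookup reads that node's own meminfo file instead of the global one.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; strip it so the field
        # names match the plain /proc/meminfo layout.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip every non-matching field, as traced above
            echo "$val"                        # e.g. 0 for HugePages_Rsvd, 1024 for HugePages_Total
            return 0
        done < <(printf '%s\n' "${mem[@]}")
    }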
IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.945 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:31.946 nr_hugepages=1024 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:31.946 resv_hugepages=0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:31.946 surplus_hugepages=0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:31.946 anon_hugepages=0 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:31.946 23:46:32 
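With surp and resv both read back as 0 and the summary values echoed, the trace moves on to the consistency check before re-reading HugePages_Total. A sketch of that check, reusing the get_meminfo sketch above (the literal 1024 comes from this run; treat the wiring as a paraphrase, not the verbatim hugepages.sh):

    nr_hugepages=1024                      # value requested earlier in the test
    surp=$(get_meminfo HugePages_Surp)     # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)     # 0 in this run
    echo "nr_hugepages=$nr_hugepages" "resv_hugepages=$resv" "surplus_hugepages=$surp"
    # The kernel's HugePages_Total must equal the requested count plus any surplus
    # and reserved pages, which is the comparison traced here as
    # (( 1024 == nr_hugepages + surp + resv )).
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))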
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77689844 kB' 'MemAvailable: 82131252 kB' 'Buffers: 14216 kB' 'Cached: 10568240 kB' 'SwapCached: 0 kB' 'Active: 6681024 kB' 'Inactive: 4410104 kB' 'Active(anon): 6111692 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4410104 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511900 kB' 'Mapped: 187396 kB' 'Shmem: 5603020 kB' 'KReclaimable: 230780 kB' 'Slab: 562392 kB' 'SReclaimable: 230780 kB' 'SUnreclaim: 331612 kB' 'KernelStack: 15888 kB' 'PageTables: 7868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7409376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200712 kB' 'VmallocChunk: 0 kB' 'Percpu: 50880 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 988584 kB' 'DirectMap2M: 15464448 kB' 'DirectMap1G: 84934656 kB' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.946 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
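get_nodes, traced just above, walks the NUMA nodes under sysfs before the per-node checks begin. A sketch under the same caveats (the per-node counter path is an assumption; the trace only shows the resulting values 1024 for node0, 0 for node1, and no_nodes=2):

    shopt -s extglob                       # assumption: needed for the node+([0-9]) glob
    nodes_sys=()
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # Record each node's current 2048 kB hugepage count, keyed by node number.
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))                 # the test expects at least one node
    }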
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.947 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.207 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35259448 kB' 'MemUsed: 12810440 kB' 'SwapCached: 0 kB' 'Active: 5699276 kB' 'Inactive: 4207904 kB' 'Active(anon): 5229600 kB' 'Inactive(anon): 0 kB' 'Active(file): 469676 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9637048 kB' 'Mapped: 156588 kB' 'AnonPages: 273232 kB' 'Shmem: 4959468 kB' 'KernelStack: 10056 kB' 'PageTables: 5336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157488 kB' 'Slab: 356036 kB' 'SReclaimable: 157488 kB' 'SUnreclaim: 198548 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 
23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
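The same lookup is being repeated here against node 0 only: when /sys/devices/system/node/node0/meminfo exists, the traced script switches mem_f to that file, strips the leading "Node 0 " from each line, and scans for HugePages_Surp. A per-node sketch under the same assumptions (hypothetical helper name, standard sysfs layout):

get_node_meminfo_sketch() {
    local want=$1 node=$2 f line var val _
    f=/sys/devices/system/node/node${node}/meminfo
    [[ -e $f ]] || return 1
    while read -r line; do
        line=${line#"Node ${node} "}             # per-node lines carry a "Node <n> " prefix
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$want" ]] && { echo "$val"; return 0; }
    done < "$f"
    return 1
}
# e.g. get_node_meminfo_sketch HugePages_Surp 0   -> prints 0 here

That matches the "echo 0" the trace returns below before nodes_test[node] is incremented by the surplus count.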
00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.208 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:32.209 node0=1024 expecting 1024 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:32.209 00:04:32.209 real 0m7.868s 00:04:32.209 user 0m2.949s 00:04:32.209 sys 0m5.115s 00:04:32.209 
23:46:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:32.209 23:46:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:32.209 ************************************ 00:04:32.209 END TEST no_shrink_alloc 00:04:32.209 ************************************ 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:32.209 23:46:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:32.209 00:04:32.209 real 0m30.987s 00:04:32.209 user 0m10.917s 00:04:32.209 sys 0m18.401s 00:04:32.209 23:46:32 setup.sh.hugepages -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:32.209 23:46:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:32.209 ************************************ 00:04:32.209 END TEST hugepages 00:04:32.209 ************************************ 00:04:32.209 23:46:32 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:32.209 23:46:32 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:32.209 23:46:32 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:32.209 23:46:32 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:32.209 ************************************ 00:04:32.209 START TEST driver 00:04:32.209 ************************************ 00:04:32.209 23:46:32 setup.sh.driver -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:32.468 * Looking for test storage... 
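The driver test that starts here ends up picking vfio-pci. Before the trace below walks through it record by record, the decision reduces to roughly the following (a sketch only, which skips the unsafe-noiommu-mode check the real setup/driver.sh also performs):

guess_driver_sketch() {
    # an IOMMU must be active (at least one group under /sys/kernel/iommu_groups)
    # and vfio_pci must resolve to real kernel modules for vfio-pci to be chosen
    if compgen -G '/sys/kernel/iommu_groups/*' > /dev/null &&
       modprobe --show-depends vfio_pci | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo 'No valid driver found'   # the marker string the traced test compares against
    fi
}

With 216 IOMMU groups on this host and modprobe --show-depends resolving vfio_pci to real .ko.xz modules, the test settles on vfio-pci and then only checks that "setup.sh config" reports the same driver for each device in the records that follow.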
00:04:32.468 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:32.468 23:46:32 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:32.468 23:46:32 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:32.468 23:46:32 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:37.741 23:46:37 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:37.741 23:46:37 setup.sh.driver -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:37.741 23:46:37 setup.sh.driver -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:37.741 23:46:37 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:37.741 ************************************ 00:04:37.741 START TEST guess_driver 00:04:37.741 ************************************ 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- common/autotest_common.sh@1121 -- # guess_driver 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:37.741 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:37.741 Looking for driver=vfio-pci 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:37.741 23:46:38 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.931 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:41.932 23:46:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:43.837 23:46:44 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:49.172 00:04:49.172 real 
0m11.483s 00:04:49.172 user 0m3.035s 00:04:49.172 sys 0m5.587s 00:04:49.172 23:46:49 setup.sh.driver.guess_driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:49.172 23:46:49 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:49.172 ************************************ 00:04:49.172 END TEST guess_driver 00:04:49.172 ************************************ 00:04:49.172 00:04:49.172 real 0m16.839s 00:04:49.172 user 0m4.572s 00:04:49.172 sys 0m8.619s 00:04:49.172 23:46:49 setup.sh.driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:49.172 23:46:49 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:49.172 ************************************ 00:04:49.172 END TEST driver 00:04:49.172 ************************************ 00:04:49.172 23:46:49 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:49.172 23:46:49 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:49.172 23:46:49 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:49.172 23:46:49 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:49.172 ************************************ 00:04:49.172 START TEST devices 00:04:49.172 ************************************ 00:04:49.172 23:46:49 setup.sh.devices -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:49.172 * Looking for test storage... 00:04:49.172 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:49.172 23:46:49 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:49.172 23:46:49 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:49.172 23:46:49 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.172 23:46:49 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1666 -- # local nvme bdf 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- 
setup/devices.sh@201 -- # ctrl=nvme0 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:53.373 23:46:53 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:53.373 No valid GPT data, bailing 00:04:53.373 23:46:53 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:53.373 23:46:53 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:53.373 23:46:53 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:53.373 23:46:53 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:53.373 ************************************ 00:04:53.373 START TEST nvme_mount 00:04:53.373 ************************************ 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1121 -- # nvme_mount 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:53.373 23:46:53 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:53.373 23:46:53 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:54.310 Creating new GPT entries in memory. 00:04:54.310 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:54.310 other utilities. 00:04:54.310 23:46:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:54.310 23:46:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.310 23:46:54 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:54.310 23:46:54 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:54.310 23:46:54 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:55.688 Creating new GPT entries in memory. 00:04:55.688 The operation has completed successfully. 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 329563 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- 
setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.688 23:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:59.881 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:59.881 23:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:59.881 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:59.881 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:04:59.881 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:59.881 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:59.881 23:47:00 
setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.881 23:47:00 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:03.169 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.428 23:47:03 
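Stepping back from the PCI allow-list scan above: the nvme_mount pass traced through this section condenses to a partition/format/mount/verify/teardown cycle. A compressed sketch, illustrative only; $disk and $mnt are placeholders, the commands are destructive, and the real test drives them through setup/common.sh and setup/devices.sh helpers rather than directly:

disk=/dev/nvme0n1                   # placeholder: the disk this particular run happens to use
mnt=/tmp/spdk_nvme_mount_sketch     # placeholder mount point (the real test mounts under test/setup)
mkdir -p "$mnt"
sgdisk "$disk" --zap-all                  # wipe any existing partition table
sgdisk "$disk" --new=1:2048:2099199       # carve out the 1 GiB test partition
mkfs.ext4 -qF "${disk}p1"                 # format it
mount "${disk}p1" "$mnt"
touch "$mnt/test_nvme"                    # dummy file the verify step looks for
# ... verification against "setup.sh config" output, as in the trace above ...
umount "$mnt"
wipefs --all "${disk}p1"                  # erases the ext4 signature ("53 ef" in the trace)
wipefs --all "$disk"                      # erases the GPT headers on the whole disk

The two wipefs lines correspond to the "... bytes were erased ..." messages in the trace: first the ext4 superblock magic on the partition, then the GPT headers and protective MBR on the whole device.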
setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.428 23:47:03 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.627 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:07.628 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:07.628 00:05:07.628 real 0m13.875s 00:05:07.628 user 0m4.142s 00:05:07.628 sys 0m7.727s 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:07.628 23:47:07 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:07.628 ************************************ 00:05:07.628 END TEST nvme_mount 00:05:07.628 ************************************ 00:05:07.628 23:47:07 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:07.628 23:47:07 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:07.628 23:47:07 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:07.628 23:47:07 
setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:07.628 ************************************ 00:05:07.628 START TEST dm_mount 00:05:07.628 ************************************ 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- common/autotest_common.sh@1121 -- # dm_mount 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:07.628 23:47:07 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:08.567 Creating new GPT entries in memory. 00:05:08.567 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:08.567 other utilities. 00:05:08.567 23:47:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:08.567 23:47:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:08.567 23:47:08 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:08.567 23:47:08 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:08.567 23:47:08 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:09.505 Creating new GPT entries in memory. 00:05:09.505 The operation has completed successfully. 00:05:09.505 23:47:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:09.505 23:47:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:09.505 23:47:09 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:09.505 23:47:09 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:09.505 23:47:09 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:10.442 The operation has completed successfully. 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 333901 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:10.442 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:10.443 23:47:10 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.443 23:47:11 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.672 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 
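The dm_mount steps above can be reproduced by hand outside the harness: sgdisk carves two 1 GiB partitions out of the NVMe disk, dmsetup stitches them into a single mapper device, and that device is formatted, mounted, exercised and unmounted again. A minimal bash sketch follows; it assumes a plain linear device-mapper table over the two partitions and an example mount point of /mnt/dm_mount, since the exact table devices.sh feeds to dmsetup is not shown in this log.

  # Sketch only -- example device names and mount point, not the test's own paths.
  disk=/dev/nvme0n1
  sgdisk "$disk" --zap-all
  sgdisk "$disk" --new=1:2048:2099199      # 2097152 sectors = 1 GiB
  sgdisk "$disk" --new=2:2099200:4196351   # second 1 GiB partition
  p1=${disk}p1
  p2=${disk}p2
  s1=$(blockdev --getsz "$p1")             # partition sizes in 512 B sectors
  s2=$(blockdev --getsz "$p2")
  # Assumed: a simple linear concatenation of the two partitions into one dm device.
  printf '0 %s linear %s 0\n%s %s linear %s 0\n' "$s1" "$p1" "$s1" "$s2" "$p2" \
      | dmsetup create nvme_dm_test
  mkfs.ext4 -qF /dev/mapper/nvme_dm_test
  mkdir -p /mnt/dm_mount
  mount /dev/mapper/nvme_dm_test /mnt/dm_mount
  # ... exercise the filesystem, then tear everything down as the test does:
  umount /mnt/dm_mount
  dmsetup remove --force nvme_dm_test
  wipefs --all "$p1" "$p2"

Once the mapper device exists, both partitions show up as holders of dm-0 under /sys/class/block/<partition>/holders/, which is exactly what the verify step above checks before and after the umount.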
00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.673 23:47:14 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 
setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.963 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- 
setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:17.964 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:17.964 00:05:17.964 real 0m10.691s 00:05:17.964 user 0m2.691s 00:05:17.964 sys 0m5.104s 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:17.964 23:47:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:17.964 ************************************ 00:05:17.964 END TEST dm_mount 00:05:17.964 ************************************ 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:17.964 23:47:18 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:18.223 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:18.223 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:18.223 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:18.223 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:18.223 23:47:18 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:18.223 00:05:18.223 real 0m29.169s 00:05:18.223 user 0m8.329s 00:05:18.223 sys 0m15.845s 00:05:18.223 23:47:18 setup.sh.devices -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:18.223 23:47:18 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:18.223 ************************************ 00:05:18.223 END TEST devices 00:05:18.224 ************************************ 00:05:18.483 00:05:18.483 real 1m45.740s 00:05:18.483 user 0m32.948s 00:05:18.483 sys 0m59.818s 00:05:18.483 23:47:18 setup.sh -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:18.483 23:47:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:18.483 ************************************ 00:05:18.483 END TEST setup.sh 00:05:18.483 ************************************ 00:05:18.483 23:47:18 -- spdk/autotest.sh@128 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:22.680 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:22.680 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:22.680 Hugepages 00:05:22.680 node hugesize free / total 00:05:22.680 node0 1048576kB 0 / 0 00:05:22.680 node0 2048kB 1024 / 1024 00:05:22.680 node1 1048576kB 0 / 0 00:05:22.680 node1 2048kB 1024 / 1024 00:05:22.680 00:05:22.680 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:22.680 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:22.680 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:22.680 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:22.680 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:22.680 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:22.680 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:22.680 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:22.680 23:47:22 -- spdk/autotest.sh@130 -- # uname -s 00:05:22.680 23:47:22 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:22.680 23:47:22 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:22.680 23:47:22 -- common/autotest_common.sh@1527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:25.973 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:25.973 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:25.973 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.973 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:28.509 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:28.509 23:47:28 -- common/autotest_common.sh@1528 -- # sleep 1 00:05:29.078 23:47:29 -- common/autotest_common.sh@1529 -- # bdfs=() 00:05:29.078 23:47:29 -- common/autotest_common.sh@1529 -- # local bdfs 00:05:29.078 23:47:29 -- common/autotest_common.sh@1530 -- # bdfs=($(get_nvme_bdfs)) 00:05:29.078 23:47:29 -- common/autotest_common.sh@1530 -- # get_nvme_bdfs 00:05:29.078 23:47:29 -- common/autotest_common.sh@1509 -- 
# bdfs=() 00:05:29.078 23:47:29 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:29.078 23:47:29 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:29.078 23:47:29 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:29.078 23:47:29 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:05:29.338 23:47:29 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:29.338 23:47:29 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:5e:00.0 00:05:29.338 23:47:29 -- common/autotest_common.sh@1532 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:32.628 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:32.628 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:32.887 Waiting for block devices as requested 00:05:32.887 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:32.887 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:33.145 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:33.145 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:33.145 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:33.404 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:33.404 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:33.404 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:33.664 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:33.664 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:33.664 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:33.922 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:33.922 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:33.922 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:34.181 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:34.181 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:34.181 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:34.439 23:47:34 -- common/autotest_common.sh@1534 -- # for bdf in "${bdfs[@]}" 00:05:34.439 23:47:34 -- common/autotest_common.sh@1535 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:34.439 23:47:34 -- common/autotest_common.sh@1498 -- # readlink -f /sys/class/nvme/nvme0 00:05:34.439 23:47:34 -- common/autotest_common.sh@1498 -- # grep 0000:5e:00.0/nvme/nvme 00:05:34.439 23:47:34 -- common/autotest_common.sh@1498 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:34.439 23:47:34 -- common/autotest_common.sh@1499 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:34.440 23:47:34 -- common/autotest_common.sh@1503 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:34.440 23:47:34 -- common/autotest_common.sh@1503 -- # printf '%s\n' nvme0 00:05:34.440 23:47:34 -- common/autotest_common.sh@1535 -- # nvme_ctrlr=/dev/nvme0 00:05:34.440 23:47:34 -- common/autotest_common.sh@1536 -- # [[ -z /dev/nvme0 ]] 00:05:34.440 23:47:34 -- common/autotest_common.sh@1541 -- # nvme id-ctrl /dev/nvme0 00:05:34.440 23:47:34 -- common/autotest_common.sh@1541 -- # grep oacs 00:05:34.440 23:47:34 -- common/autotest_common.sh@1541 -- # cut -d: -f2 00:05:34.440 23:47:34 -- common/autotest_common.sh@1541 -- # oacs=' 0x3f' 00:05:34.440 23:47:34 -- common/autotest_common.sh@1542 -- # oacs_ns_manage=8 00:05:34.440 23:47:34 -- common/autotest_common.sh@1544 -- # [[ 8 -ne 0 ]] 00:05:34.440 23:47:34 -- common/autotest_common.sh@1550 -- # nvme id-ctrl /dev/nvme0 00:05:34.440 23:47:34 -- 
common/autotest_common.sh@1550 -- # grep unvmcap 00:05:34.440 23:47:34 -- common/autotest_common.sh@1550 -- # cut -d: -f2 00:05:34.440 23:47:34 -- common/autotest_common.sh@1550 -- # unvmcap=' 0' 00:05:34.440 23:47:34 -- common/autotest_common.sh@1551 -- # [[ 0 -eq 0 ]] 00:05:34.440 23:47:34 -- common/autotest_common.sh@1553 -- # continue 00:05:34.440 23:47:34 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:34.440 23:47:34 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:34.440 23:47:34 -- common/autotest_common.sh@10 -- # set +x 00:05:34.440 23:47:34 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:34.440 23:47:34 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:34.440 23:47:34 -- common/autotest_common.sh@10 -- # set +x 00:05:34.440 23:47:34 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:38.708 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:38.708 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:38.708 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:38.708 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:38.709 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:41.244 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:41.244 23:47:41 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:41.244 23:47:41 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:41.244 23:47:41 -- common/autotest_common.sh@10 -- # set +x 00:05:41.244 23:47:41 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:41.244 23:47:41 -- common/autotest_common.sh@1587 -- # mapfile -t bdfs 00:05:41.244 23:47:41 -- common/autotest_common.sh@1587 -- # get_nvme_bdfs_by_id 0x0a54 00:05:41.244 23:47:41 -- common/autotest_common.sh@1573 -- # bdfs=() 00:05:41.244 23:47:41 -- common/autotest_common.sh@1573 -- # local bdfs 00:05:41.244 23:47:41 -- common/autotest_common.sh@1575 -- # get_nvme_bdfs 00:05:41.245 23:47:41 -- common/autotest_common.sh@1509 -- # bdfs=() 00:05:41.245 23:47:41 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:41.245 23:47:41 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:41.245 23:47:41 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:41.245 23:47:41 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:05:41.245 23:47:41 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:41.245 23:47:41 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:5e:00.0 00:05:41.245 23:47:41 -- common/autotest_common.sh@1575 -- # for bdf in $(get_nvme_bdfs) 00:05:41.245 23:47:41 -- common/autotest_common.sh@1576 -- # cat 
/sys/bus/pci/devices/0000:5e:00.0/device 00:05:41.245 23:47:41 -- common/autotest_common.sh@1576 -- # device=0x0b60 00:05:41.245 23:47:41 -- common/autotest_common.sh@1577 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:41.245 23:47:41 -- common/autotest_common.sh@1582 -- # printf '%s\n' 00:05:41.245 23:47:41 -- common/autotest_common.sh@1588 -- # [[ -z '' ]] 00:05:41.245 23:47:41 -- common/autotest_common.sh@1589 -- # return 0 00:05:41.245 23:47:41 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:41.245 23:47:41 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:41.245 23:47:41 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:41.245 23:47:41 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:41.245 23:47:41 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:41.813 Restarting all devices. 00:05:46.011 lstat() error: No such file or directory 00:05:46.011 QAT Error: No GENERAL section found 00:05:46.011 Failed to configure qat_dev0 00:05:46.011 lstat() error: No such file or directory 00:05:46.011 QAT Error: No GENERAL section found 00:05:46.011 Failed to configure qat_dev1 00:05:46.011 lstat() error: No such file or directory 00:05:46.011 QAT Error: No GENERAL section found 00:05:46.011 Failed to configure qat_dev2 00:05:46.011 enable sriov 00:05:46.011 Checking status of all devices. 00:05:46.011 There is 3 QAT acceleration device(s) in the system: 00:05:46.011 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:46.011 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:46.011 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:46.948 0000:3d:00.0 set to 16 VFs 00:05:48.327 0000:3f:00.0 set to 16 VFs 00:05:49.722 0000:da:00.0 set to 16 VFs 00:05:53.011 Properly configured the qat device with driver uio_pci_generic. 00:05:53.011 23:47:53 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:53.011 23:47:53 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:53.011 23:47:53 -- common/autotest_common.sh@10 -- # set +x 00:05:53.011 23:47:53 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:53.011 23:47:53 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:53.011 23:47:53 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:53.011 23:47:53 -- common/autotest_common.sh@10 -- # set +x 00:05:53.011 ************************************ 00:05:53.011 START TEST env 00:05:53.011 ************************************ 00:05:53.011 23:47:53 env -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:53.011 * Looking for test storage... 
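The qat_setup.sh run above ends with each c6xx endpoint "set to 16 VFs" and the resulting VFs driven by uio_pci_generic. The same effect can be achieved through standard PCI sysfs knobs; the sketch below is a generic illustration of that mechanism, not the script's own code, and 0000:3d:00.0 is simply the first of the three endpoints listed in the status output above.

  # Enable SR-IOV VFs on one QAT endpoint and hand them to uio_pci_generic.
  bdf=0000:3d:00.0                                   # one of the c6xx devices reported above
  modprobe uio_pci_generic
  echo 0  > /sys/bus/pci/devices/$bdf/sriov_numvfs   # clear any existing VFs first
  echo 16 > /sys/bus/pci/devices/$bdf/sriov_numvfs   # matches "set to 16 VFs"
  for vf in /sys/bus/pci/devices/$bdf/virtfn*; do
      vfbdf=$(basename "$(readlink "$vf")")          # e.g. 0000:3d:01.0
      # If the VF auto-bound to a kernel driver, detach it before overriding.
      [ -e /sys/bus/pci/devices/$vfbdf/driver ] && \
          echo "$vfbdf" > /sys/bus/pci/devices/$vfbdf/driver/unbind
      echo uio_pci_generic > /sys/bus/pci/devices/$vfbdf/driver_override
      echo "$vfbdf" > /sys/bus/pci/drivers_probe     # reprobe with the override in place
  done

This is roughly the mechanism scripts/qat_setup.sh automates here, alongside restarting the devices (the "Restarting all devices." output above).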
00:05:53.011 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:53.011 23:47:53 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:53.011 23:47:53 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:53.011 23:47:53 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:53.011 23:47:53 env -- common/autotest_common.sh@10 -- # set +x 00:05:53.011 ************************************ 00:05:53.011 START TEST env_memory 00:05:53.011 ************************************ 00:05:53.011 23:47:53 env.env_memory -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:53.011 00:05:53.011 00:05:53.011 CUnit - A unit testing framework for C - Version 2.1-3 00:05:53.011 http://cunit.sourceforge.net/ 00:05:53.011 00:05:53.011 00:05:53.011 Suite: memory 00:05:53.011 Test: alloc and free memory map ...[2024-05-14 23:47:53.360726] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:53.011 passed 00:05:53.011 Test: mem map translation ...[2024-05-14 23:47:53.390074] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:53.011 [2024-05-14 23:47:53.390098] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:53.011 [2024-05-14 23:47:53.390154] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:53.011 [2024-05-14 23:47:53.390168] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:53.011 passed 00:05:53.011 Test: mem map registration ...[2024-05-14 23:47:53.447906] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:53.011 [2024-05-14 23:47:53.447929] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:53.011 passed 00:05:53.011 Test: mem map adjacent registrations ...passed 00:05:53.011 00:05:53.011 Run Summary: Type Total Ran Passed Failed Inactive 00:05:53.011 suites 1 1 n/a 0 0 00:05:53.011 tests 4 4 4 0 0 00:05:53.011 asserts 152 152 152 0 n/a 00:05:53.011 00:05:53.011 Elapsed time = 0.201 seconds 00:05:53.011 00:05:53.011 real 0m0.215s 00:05:53.011 user 0m0.201s 00:05:53.011 sys 0m0.013s 00:05:53.011 23:47:53 env.env_memory -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:53.011 23:47:53 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:53.011 ************************************ 00:05:53.011 END TEST env_memory 00:05:53.011 ************************************ 00:05:53.011 23:47:53 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:53.011 23:47:53 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:53.011 23:47:53 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:53.011 23:47:53 env 
-- common/autotest_common.sh@10 -- # set +x 00:05:53.271 ************************************ 00:05:53.271 START TEST env_vtophys 00:05:53.271 ************************************ 00:05:53.271 23:47:53 env.env_vtophys -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:53.271 EAL: lib.eal log level changed from notice to debug 00:05:53.271 EAL: Detected lcore 0 as core 0 on socket 0 00:05:53.271 EAL: Detected lcore 1 as core 1 on socket 0 00:05:53.271 EAL: Detected lcore 2 as core 2 on socket 0 00:05:53.271 EAL: Detected lcore 3 as core 3 on socket 0 00:05:53.271 EAL: Detected lcore 4 as core 4 on socket 0 00:05:53.271 EAL: Detected lcore 5 as core 8 on socket 0 00:05:53.271 EAL: Detected lcore 6 as core 9 on socket 0 00:05:53.271 EAL: Detected lcore 7 as core 10 on socket 0 00:05:53.271 EAL: Detected lcore 8 as core 11 on socket 0 00:05:53.271 EAL: Detected lcore 9 as core 16 on socket 0 00:05:53.271 EAL: Detected lcore 10 as core 17 on socket 0 00:05:53.271 EAL: Detected lcore 11 as core 18 on socket 0 00:05:53.271 EAL: Detected lcore 12 as core 19 on socket 0 00:05:53.271 EAL: Detected lcore 13 as core 20 on socket 0 00:05:53.271 EAL: Detected lcore 14 as core 24 on socket 0 00:05:53.271 EAL: Detected lcore 15 as core 25 on socket 0 00:05:53.271 EAL: Detected lcore 16 as core 26 on socket 0 00:05:53.271 EAL: Detected lcore 17 as core 27 on socket 0 00:05:53.271 EAL: Detected lcore 18 as core 0 on socket 1 00:05:53.271 EAL: Detected lcore 19 as core 1 on socket 1 00:05:53.271 EAL: Detected lcore 20 as core 2 on socket 1 00:05:53.271 EAL: Detected lcore 21 as core 3 on socket 1 00:05:53.271 EAL: Detected lcore 22 as core 4 on socket 1 00:05:53.271 EAL: Detected lcore 23 as core 8 on socket 1 00:05:53.271 EAL: Detected lcore 24 as core 9 on socket 1 00:05:53.271 EAL: Detected lcore 25 as core 10 on socket 1 00:05:53.271 EAL: Detected lcore 26 as core 11 on socket 1 00:05:53.271 EAL: Detected lcore 27 as core 16 on socket 1 00:05:53.271 EAL: Detected lcore 28 as core 17 on socket 1 00:05:53.271 EAL: Detected lcore 29 as core 18 on socket 1 00:05:53.271 EAL: Detected lcore 30 as core 19 on socket 1 00:05:53.271 EAL: Detected lcore 31 as core 20 on socket 1 00:05:53.271 EAL: Detected lcore 32 as core 24 on socket 1 00:05:53.271 EAL: Detected lcore 33 as core 25 on socket 1 00:05:53.271 EAL: Detected lcore 34 as core 26 on socket 1 00:05:53.271 EAL: Detected lcore 35 as core 27 on socket 1 00:05:53.271 EAL: Detected lcore 36 as core 0 on socket 0 00:05:53.271 EAL: Detected lcore 37 as core 1 on socket 0 00:05:53.271 EAL: Detected lcore 38 as core 2 on socket 0 00:05:53.271 EAL: Detected lcore 39 as core 3 on socket 0 00:05:53.271 EAL: Detected lcore 40 as core 4 on socket 0 00:05:53.271 EAL: Detected lcore 41 as core 8 on socket 0 00:05:53.271 EAL: Detected lcore 42 as core 9 on socket 0 00:05:53.271 EAL: Detected lcore 43 as core 10 on socket 0 00:05:53.271 EAL: Detected lcore 44 as core 11 on socket 0 00:05:53.271 EAL: Detected lcore 45 as core 16 on socket 0 00:05:53.271 EAL: Detected lcore 46 as core 17 on socket 0 00:05:53.271 EAL: Detected lcore 47 as core 18 on socket 0 00:05:53.271 EAL: Detected lcore 48 as core 19 on socket 0 00:05:53.271 EAL: Detected lcore 49 as core 20 on socket 0 00:05:53.271 EAL: Detected lcore 50 as core 24 on socket 0 00:05:53.271 EAL: Detected lcore 51 as core 25 on socket 0 00:05:53.271 EAL: Detected lcore 52 as core 26 on socket 0 00:05:53.271 EAL: Detected lcore 53 as core 27 on socket 0 
00:05:53.271 EAL: Detected lcore 54 as core 0 on socket 1 00:05:53.271 EAL: Detected lcore 55 as core 1 on socket 1 00:05:53.271 EAL: Detected lcore 56 as core 2 on socket 1 00:05:53.271 EAL: Detected lcore 57 as core 3 on socket 1 00:05:53.271 EAL: Detected lcore 58 as core 4 on socket 1 00:05:53.271 EAL: Detected lcore 59 as core 8 on socket 1 00:05:53.271 EAL: Detected lcore 60 as core 9 on socket 1 00:05:53.271 EAL: Detected lcore 61 as core 10 on socket 1 00:05:53.271 EAL: Detected lcore 62 as core 11 on socket 1 00:05:53.271 EAL: Detected lcore 63 as core 16 on socket 1 00:05:53.271 EAL: Detected lcore 64 as core 17 on socket 1 00:05:53.271 EAL: Detected lcore 65 as core 18 on socket 1 00:05:53.271 EAL: Detected lcore 66 as core 19 on socket 1 00:05:53.271 EAL: Detected lcore 67 as core 20 on socket 1 00:05:53.271 EAL: Detected lcore 68 as core 24 on socket 1 00:05:53.271 EAL: Detected lcore 69 as core 25 on socket 1 00:05:53.271 EAL: Detected lcore 70 as core 26 on socket 1 00:05:53.271 EAL: Detected lcore 71 as core 27 on socket 1 00:05:53.271 EAL: Maximum logical cores by configuration: 128 00:05:53.271 EAL: Detected CPU lcores: 72 00:05:53.271 EAL: Detected NUMA nodes: 2 00:05:53.271 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:53.271 EAL: Detected shared linkage of DPDK 00:05:53.271 EAL: No shared files mode enabled, IPC will be disabled 00:05:53.271 EAL: No shared files mode enabled, IPC is disabled 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 
0000:3f:02.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:05:53.271 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:05:53.271 EAL: Bus pci wants IOVA as 'PA' 00:05:53.271 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:53.271 EAL: Bus vdev wants IOVA as 'DC' 00:05:53.271 EAL: Selected IOVA mode 'PA' 00:05:53.271 EAL: Probing VFIO support... 00:05:53.271 EAL: IOMMU type 1 (Type 1) is supported 00:05:53.271 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:53.271 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:53.271 EAL: VFIO support initialized 00:05:53.271 EAL: Ask a virtual area of 0x2e000 bytes 00:05:53.271 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:53.271 EAL: Setting up physically contiguous memory... 
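At this point EAL has reported which IOVA mode it selected ('PA', because the qat devices ask for physical addresses) and that VFIO with IOMMU type 1 is available, and it is about to lay out the hugepage-backed memory segments. The checks below mirror those probes with plain sysfs reads; they are a hand-run sketch for inspecting a node, not part of the test flow, and the paths are standard kernel interfaces.

  # Is an IOMMU exposed, and is the VFIO container device present?
  if compgen -G '/sys/class/iommu/*' > /dev/null; then
      echo "IOMMU present (VFIO type 1 can be used, as probed above)"
  else
      echo "no IOMMU visible"
  fi
  modprobe vfio-pci 2>/dev/null || true
  [ -e /dev/vfio/vfio ] && echo "VFIO container /dev/vfio/vfio available"
  # Hugepage supply per NUMA node -- compare with the 2048kB pools EAL maps below.
  for n in /sys/devices/system/node/node*; do
      echo "$n: $(cat $n/hugepages/hugepages-2048kB/free_hugepages) free 2 MiB pages"
  done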
00:05:53.271 EAL: Setting maximum number of open files to 524288 00:05:53.271 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:53.271 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:53.271 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:53.271 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:53.271 EAL: Ask a virtual area of 0x61000 bytes 00:05:53.271 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:53.271 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:53.271 EAL: Ask a virtual area of 0x400000000 bytes 00:05:53.271 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:53.271 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:53.271 EAL: Hugepages will be freed exactly as allocated. 00:05:53.271 EAL: No shared files mode enabled, IPC is disabled 00:05:53.271 EAL: No shared files mode enabled, IPC is disabled 00:05:53.271 EAL: TSC frequency is ~2300000 KHz 00:05:53.271 EAL: Main lcore 0 is ready (tid=7f0ae1f63b00;cpuset=[0]) 00:05:53.271 EAL: Trying to obtain current memory policy. 00:05:53.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.271 EAL: Restoring previous memory policy: 0 00:05:53.271 EAL: request: mp_malloc_sync 00:05:53.271 EAL: No shared files mode enabled, IPC is disabled 00:05:53.271 EAL: Heap on socket 0 was expanded by 2MB 00:05:53.271 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001000000 00:05:53.271 EAL: PCI memory mapped at 0x202001001000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001002000 00:05:53.271 EAL: PCI memory mapped at 0x202001003000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001004000 00:05:53.271 EAL: PCI memory mapped at 0x202001005000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001006000 00:05:53.271 EAL: PCI memory mapped at 0x202001007000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001008000 00:05:53.271 EAL: PCI memory mapped at 0x202001009000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x20200100a000 00:05:53.271 EAL: PCI memory mapped at 0x20200100b000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x20200100c000 00:05:53.271 EAL: PCI memory mapped at 0x20200100d000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x20200100e000 00:05:53.271 EAL: PCI memory mapped at 0x20200100f000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001010000 00:05:53.271 EAL: PCI memory mapped at 0x202001011000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 
EAL: PCI memory mapped at 0x202001012000 00:05:53.271 EAL: PCI memory mapped at 0x202001013000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001014000 00:05:53.271 EAL: PCI memory mapped at 0x202001015000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001016000 00:05:53.271 EAL: PCI memory mapped at 0x202001017000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:53.271 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:05:53.271 EAL: probe driver: 8086:37c9 qat 00:05:53.271 EAL: PCI memory mapped at 0x202001018000 00:05:53.271 EAL: PCI memory mapped at 0x202001019000 00:05:53.271 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:53.272 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200101a000 00:05:53.272 EAL: PCI memory mapped at 0x20200101b000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:53.272 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200101c000 00:05:53.272 EAL: PCI memory mapped at 0x20200101d000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:53.272 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200101e000 00:05:53.272 EAL: PCI memory mapped at 0x20200101f000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001020000 00:05:53.272 EAL: PCI memory mapped at 0x202001021000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001022000 00:05:53.272 EAL: PCI memory mapped at 0x202001023000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001024000 00:05:53.272 EAL: PCI memory mapped at 0x202001025000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001026000 00:05:53.272 EAL: PCI memory mapped at 0x202001027000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001028000 00:05:53.272 EAL: PCI memory mapped at 0x202001029000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 
00:05:53.272 EAL: PCI memory mapped at 0x20200102a000 00:05:53.272 EAL: PCI memory mapped at 0x20200102b000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200102c000 00:05:53.272 EAL: PCI memory mapped at 0x20200102d000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200102e000 00:05:53.272 EAL: PCI memory mapped at 0x20200102f000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001030000 00:05:53.272 EAL: PCI memory mapped at 0x202001031000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001032000 00:05:53.272 EAL: PCI memory mapped at 0x202001033000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001034000 00:05:53.272 EAL: PCI memory mapped at 0x202001035000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001036000 00:05:53.272 EAL: PCI memory mapped at 0x202001037000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001038000 00:05:53.272 EAL: PCI memory mapped at 0x202001039000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200103a000 00:05:53.272 EAL: PCI memory mapped at 0x20200103b000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200103c000 00:05:53.272 EAL: PCI memory mapped at 0x20200103d000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:53.272 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200103e000 00:05:53.272 EAL: PCI memory mapped at 0x20200103f000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:53.272 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001040000 00:05:53.272 EAL: PCI memory mapped at 0x202001041000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:53.272 EAL: Trying to obtain current memory policy. 
00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 1 was expanded by 2MB 00:05:53.272 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001042000 00:05:53.272 EAL: PCI memory mapped at 0x202001043000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001044000 00:05:53.272 EAL: PCI memory mapped at 0x202001045000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001046000 00:05:53.272 EAL: PCI memory mapped at 0x202001047000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001048000 00:05:53.272 EAL: PCI memory mapped at 0x202001049000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200104a000 00:05:53.272 EAL: PCI memory mapped at 0x20200104b000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200104c000 00:05:53.272 EAL: PCI memory mapped at 0x20200104d000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200104e000 00:05:53.272 EAL: PCI memory mapped at 0x20200104f000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001050000 00:05:53.272 EAL: PCI memory mapped at 0x202001051000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001052000 00:05:53.272 EAL: PCI memory mapped at 0x202001053000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001054000 00:05:53.272 EAL: PCI memory mapped at 0x202001055000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001056000 00:05:53.272 EAL: PCI memory mapped at 0x202001057000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
00:05:53.272 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x202001058000 00:05:53.272 EAL: PCI memory mapped at 0x202001059000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200105a000 00:05:53.272 EAL: PCI memory mapped at 0x20200105b000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200105c000 00:05:53.272 EAL: PCI memory mapped at 0x20200105d000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:53.272 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:05:53.272 EAL: probe driver: 8086:37c9 qat 00:05:53.272 EAL: PCI memory mapped at 0x20200105e000 00:05:53.272 EAL: PCI memory mapped at 0x20200105f000 00:05:53.272 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:53.272 EAL: Mem event callback 'spdk:(nil)' registered 00:05:53.272 00:05:53.272 00:05:53.272 CUnit - A unit testing framework for C - Version 2.1-3 00:05:53.272 http://cunit.sourceforge.net/ 00:05:53.272 00:05:53.272 00:05:53.272 Suite: components_suite 00:05:53.272 Test: vtophys_malloc_test ...passed 00:05:53.272 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 4MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 4MB 00:05:53.272 EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 6MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 6MB 00:05:53.272 EAL: Trying to obtain current memory policy. 
00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 10MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 10MB 00:05:53.272 EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 18MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 18MB 00:05:53.272 EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 34MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 34MB 00:05:53.272 EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.272 EAL: Restoring previous memory policy: 4 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was expanded by 66MB 00:05:53.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.272 EAL: request: mp_malloc_sync 00:05:53.272 EAL: No shared files mode enabled, IPC is disabled 00:05:53.272 EAL: Heap on socket 0 was shrunk by 66MB 00:05:53.272 EAL: Trying to obtain current memory policy. 00:05:53.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.532 EAL: Restoring previous memory policy: 4 00:05:53.532 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.532 EAL: request: mp_malloc_sync 00:05:53.532 EAL: No shared files mode enabled, IPC is disabled 00:05:53.532 EAL: Heap on socket 0 was expanded by 130MB 00:05:53.532 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.532 EAL: request: mp_malloc_sync 00:05:53.532 EAL: No shared files mode enabled, IPC is disabled 00:05:53.532 EAL: Heap on socket 0 was shrunk by 130MB 00:05:53.532 EAL: Trying to obtain current memory policy. 
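Note: the repeated "Heap on socket 0 was expanded/shrunk by N MB" pairs above come from vtophys_spdk_malloc_test allocating and releasing progressively larger DMA-able buffers; each allocation of a new size makes DPDK grow the hugepage-backed heap and fires the registered 'spdk:(nil)' mem event callback, and each free lets it shrink again. A rough, hypothetical sketch of that allocation pattern follows, assuming the public spdk_malloc()/spdk_vtophys()/spdk_free() API from include/spdk/env.h; the sizes and checks in the actual test under test/env/ in the SPDK tree may differ.

    /* Rough sketch (assumed API, not the actual test code): allocate DMA-safe
     * buffers of growing size, translate them with spdk_vtophys(), and free
     * them -- the pattern behind the expand/shrink messages in the log above. */
    #include <stdio.h>
    #include "spdk/env.h"

    static int exercise_vtophys(void)
    {
        for (size_t size = 4UL * 1024 * 1024; size <= 128UL * 1024 * 1024; size *= 2) {
            void *buf = spdk_malloc(size, 0x1000, NULL,
                                    SPDK_ENV_SOCKET_ID_ANY, SPDK_MALLOC_DMA);
            if (buf == NULL) {
                fprintf(stderr, "spdk_malloc(%zu) failed\n", size);
                return -1;
            }

            uint64_t paddr = spdk_vtophys(buf, NULL);  /* VA -> PA/IOVA lookup */
            if (paddr == SPDK_VTOPHYS_ERROR) {
                fprintf(stderr, "no translation for %p\n", buf);
                spdk_free(buf);
                return -1;
            }

            spdk_free(buf);  /* heap may shrink again -> mem event callback */
        }
        return 0;
    }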
00:05:53.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.532 EAL: Restoring previous memory policy: 4 00:05:53.532 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.532 EAL: request: mp_malloc_sync 00:05:53.532 EAL: No shared files mode enabled, IPC is disabled 00:05:53.532 EAL: Heap on socket 0 was expanded by 258MB 00:05:53.532 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.532 EAL: request: mp_malloc_sync 00:05:53.532 EAL: No shared files mode enabled, IPC is disabled 00:05:53.532 EAL: Heap on socket 0 was shrunk by 258MB 00:05:53.532 EAL: Trying to obtain current memory policy. 00:05:53.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:53.791 EAL: Restoring previous memory policy: 4 00:05:53.791 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.791 EAL: request: mp_malloc_sync 00:05:53.791 EAL: No shared files mode enabled, IPC is disabled 00:05:53.791 EAL: Heap on socket 0 was expanded by 514MB 00:05:53.791 EAL: Calling mem event callback 'spdk:(nil)' 00:05:53.791 EAL: request: mp_malloc_sync 00:05:53.791 EAL: No shared files mode enabled, IPC is disabled 00:05:53.791 EAL: Heap on socket 0 was shrunk by 514MB 00:05:53.791 EAL: Trying to obtain current memory policy. 00:05:53.791 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:54.050 EAL: Restoring previous memory policy: 4 00:05:54.050 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.050 EAL: request: mp_malloc_sync 00:05:54.050 EAL: No shared files mode enabled, IPC is disabled 00:05:54.050 EAL: Heap on socket 0 was expanded by 1026MB 00:05:54.308 EAL: Calling mem event callback 'spdk:(nil)' 00:05:54.568 EAL: request: mp_malloc_sync 00:05:54.568 EAL: No shared files mode enabled, IPC is disabled 00:05:54.568 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:54.568 passed 00:05:54.568 00:05:54.568 Run Summary: Type Total Ran Passed Failed Inactive 00:05:54.568 suites 1 1 n/a 0 0 00:05:54.568 tests 2 2 2 0 0 00:05:54.568 asserts 5778 5778 5778 0 n/a 00:05:54.568 00:05:54.568 Elapsed time = 1.183 seconds 00:05:54.568 EAL: No shared files mode enabled, IPC is disabled 00:05:54.568 EAL: No shared files mode enabled, IPC is disabled 00:05:54.568 EAL: No shared files mode enabled, IPC is disabled 00:05:54.568 00:05:54.568 real 0m1.379s 00:05:54.568 user 0m0.780s 00:05:54.568 sys 0m0.572s 00:05:54.568 23:47:54 env.env_vtophys -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:54.568 23:47:54 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:54.568 ************************************ 00:05:54.568 END TEST env_vtophys 00:05:54.568 ************************************ 00:05:54.568 23:47:55 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:54.568 23:47:55 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:54.568 23:47:55 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:54.568 23:47:55 env -- common/autotest_common.sh@10 -- # set +x 00:05:54.568 ************************************ 00:05:54.568 START TEST env_pci 00:05:54.568 ************************************ 00:05:54.568 23:47:55 env.env_pci -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:54.568 00:05:54.568 00:05:54.568 CUnit - A unit testing framework for C - Version 2.1-3 00:05:54.568 http://cunit.sourceforge.net/ 00:05:54.568 00:05:54.568 00:05:54.568 Suite: pci 00:05:54.568 Test: pci_hook ...[2024-05-14 23:47:55.115774] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 345027 has claimed it 00:05:54.568 EAL: Cannot find device (10000:00:01.0) 00:05:54.568 EAL: Failed to attach device on primary process 00:05:54.568 passed 00:05:54.568 00:05:54.568 Run Summary: Type Total Ran Passed Failed Inactive 00:05:54.568 suites 1 1 n/a 0 0 00:05:54.568 tests 1 1 1 0 0 00:05:54.568 asserts 25 25 25 0 n/a 00:05:54.568 00:05:54.568 Elapsed time = 0.034 seconds 00:05:54.568 00:05:54.568 real 0m0.062s 00:05:54.568 user 0m0.020s 00:05:54.568 sys 0m0.042s 00:05:54.568 23:47:55 env.env_pci -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:54.568 23:47:55 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:54.568 ************************************ 00:05:54.568 END TEST env_pci 00:05:54.568 ************************************ 00:05:54.828 23:47:55 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:54.828 23:47:55 env -- env/env.sh@15 -- # uname 00:05:54.828 23:47:55 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:54.828 23:47:55 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:54.828 23:47:55 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:54.828 23:47:55 env -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:05:54.828 23:47:55 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:54.828 23:47:55 env -- common/autotest_common.sh@10 -- # set +x 00:05:54.828 ************************************ 00:05:54.828 START TEST env_dpdk_post_init 00:05:54.828 ************************************ 00:05:54.828 23:47:55 env.env_dpdk_post_init -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:54.828 EAL: Detected CPU lcores: 72 00:05:54.828 EAL: Detected NUMA nodes: 2 00:05:54.828 EAL: Detected shared linkage of DPDK 00:05:54.828 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:54.828 EAL: Selected IOVA mode 'PA' 00:05:54.828 EAL: VFIO support initialized 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:54.828 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue 
pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.828 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:54.828 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.828 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 
00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 
0000:da:02.3_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.829 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:54.829 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.829 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:55.088 EAL: Using IOMMU type 1 (Type 1) 00:05:55.088 EAL: Ignore mapping IO port bar(1) 00:05:55.088 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:55.088 EAL: Ignore mapping IO port bar(1) 00:05:55.088 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:55.088 EAL: Ignore mapping IO port bar(1) 00:05:55.088 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:55.088 EAL: Ignore mapping IO port bar(1) 00:05:55.089 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:55.089 EAL: Ignore mapping IO port bar(1) 00:05:55.089 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:55.089 EAL: Ignore mapping IO port bar(1) 00:05:55.089 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:55.089 EAL: Ignore mapping IO port bar(1) 00:05:55.089 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:55.089 EAL: Ignore mapping IO port bar(1) 00:05:55.089 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:55.348 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port 
bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Ignore mapping IO port bar(5) 00:05:55.348 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:05:55.348 EAL: Ignore mapping IO port bar(1) 00:05:55.348 EAL: Ignore mapping IO port bar(5) 00:05:55.348 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:05:57.916 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:57.916 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:57.916 Starting DPDK initialization... 00:05:57.916 Starting SPDK post initialization... 00:05:57.916 SPDK NVMe probe 00:05:57.916 Attaching to 0000:5e:00.0 00:05:57.916 Attached to 0000:5e:00.0 00:05:57.916 Cleaning up... 00:05:58.205 00:05:58.205 real 0m3.264s 00:05:58.205 user 0m2.226s 00:05:58.205 sys 0m0.594s 00:05:58.205 23:47:58 env.env_dpdk_post_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:58.205 23:47:58 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:58.205 ************************************ 00:05:58.205 END TEST env_dpdk_post_init 00:05:58.205 ************************************ 00:05:58.205 23:47:58 env -- env/env.sh@26 -- # uname 00:05:58.205 23:47:58 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:58.205 23:47:58 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:58.205 23:47:58 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:58.205 23:47:58 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:58.205 23:47:58 env -- common/autotest_common.sh@10 -- # set +x 00:05:58.205 ************************************ 00:05:58.205 START TEST env_mem_callbacks 00:05:58.205 ************************************ 00:05:58.205 23:47:58 env.env_mem_callbacks -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:58.205 EAL: Detected CPU lcores: 72 00:05:58.205 EAL: Detected NUMA nodes: 2 00:05:58.205 EAL: Detected shared linkage of DPDK 00:05:58.205 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:58.205 EAL: Selected IOVA mode 'PA' 00:05:58.205 EAL: VFIO support initialized 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:58.205 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue 
pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:58.205 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.205 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:58.205 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 
00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 
0000:da:02.1_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:58.206 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:58.206 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:58.206 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:58.206 00:05:58.206 00:05:58.206 CUnit - A unit testing framework for C - Version 2.1-3 00:05:58.206 http://cunit.sourceforge.net/ 00:05:58.206 00:05:58.206 00:05:58.206 Suite: memory 00:05:58.206 Test: test ... 
00:05:58.206 register 0x200000200000 2097152 00:05:58.206 register 0x201000a00000 2097152 00:05:58.206 malloc 3145728 00:05:58.206 register 0x200000400000 4194304 00:05:58.206 buf 0x200000500000 len 3145728 PASSED 00:05:58.206 malloc 64 00:05:58.206 buf 0x2000004fff40 len 64 PASSED 00:05:58.206 malloc 4194304 00:05:58.206 register 0x200000800000 6291456 00:05:58.206 buf 0x200000a00000 len 4194304 PASSED 00:05:58.206 free 0x200000500000 3145728 00:05:58.206 free 0x2000004fff40 64 00:05:58.206 unregister 0x200000400000 4194304 PASSED 00:05:58.206 free 0x200000a00000 4194304 00:05:58.206 unregister 0x200000800000 6291456 PASSED 00:05:58.206 malloc 8388608 00:05:58.206 register 0x200000400000 10485760 00:05:58.206 buf 0x200000600000 len 8388608 PASSED 00:05:58.206 free 0x200000600000 8388608 00:05:58.206 unregister 0x200000400000 10485760 PASSED 00:05:58.206 passed 00:05:58.206 00:05:58.206 Run Summary: Type Total Ran Passed Failed Inactive 00:05:58.206 suites 1 1 n/a 0 0 00:05:58.206 tests 1 1 1 0 0 00:05:58.206 asserts 16 16 16 0 n/a 00:05:58.206 00:05:58.206 Elapsed time = 0.008 seconds 00:05:58.206 00:05:58.206 real 0m0.113s 00:05:58.206 user 0m0.036s 00:05:58.206 sys 0m0.077s 00:05:58.206 23:47:58 env.env_mem_callbacks -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:58.206 23:47:58 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:58.206 ************************************ 00:05:58.206 END TEST env_mem_callbacks 00:05:58.206 ************************************ 00:05:58.206 00:05:58.206 real 0m5.617s 00:05:58.206 user 0m3.449s 00:05:58.206 sys 0m1.718s 00:05:58.206 23:47:58 env -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:58.206 23:47:58 env -- common/autotest_common.sh@10 -- # set +x 00:05:58.206 ************************************ 00:05:58.206 END TEST env 00:05:58.206 ************************************ 00:05:58.465 23:47:58 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:58.465 23:47:58 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:58.465 23:47:58 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:58.465 23:47:58 -- common/autotest_common.sh@10 -- # set +x 00:05:58.465 ************************************ 00:05:58.465 START TEST rpc 00:05:58.465 ************************************ 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:58.465 * Looking for test storage... 00:05:58.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:58.465 23:47:58 rpc -- rpc/rpc.sh@65 -- # spdk_pid=345631 00:05:58.465 23:47:58 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:58.465 23:47:58 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:58.465 23:47:58 rpc -- rpc/rpc.sh@67 -- # waitforlisten 345631 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@827 -- # '[' -z 345631 ']' 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
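The rpc suite that begins above drives the freshly started spdk_tgt over the /var/tmp/spdk.sock UNIX socket. rpc_cmd in rpc.sh is an autotest helper around SPDK's JSON-RPC client; as a rough standalone sketch of the bdev round trip that rpc_integrity performs below (scripts/rpc.py is assumed here in place of that helper, and the jq checks are illustrative):

    # create a malloc bdev (8 MB, 512-byte blocks) and layer a passthru bdev on it
    scripts/rpc.py bdev_malloc_create 8 512                   # returns the name Malloc0
    scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
    scripts/rpc.py bdev_get_bdevs | jq length                 # both bdevs present: expect 2
    # tear down in reverse order and confirm the bdev list is empty again
    scripts/rpc.py bdev_passthru_delete Passthru0
    scripts/rpc.py bdev_malloc_delete Malloc0
    scripts/rpc.py bdev_get_bdevs | jq length                 # expect 0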
00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:58.465 23:47:58 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:58.465 [2024-05-14 23:47:59.018638] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:05:58.465 [2024-05-14 23:47:59.018698] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid345631 ] 00:05:58.724 [2024-05-14 23:47:59.134668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.724 [2024-05-14 23:47:59.233334] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:58.724 [2024-05-14 23:47:59.233387] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 345631' to capture a snapshot of events at runtime. 00:05:58.724 [2024-05-14 23:47:59.233411] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:58.724 [2024-05-14 23:47:59.233424] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:58.724 [2024-05-14 23:47:59.233435] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid345631 for offline analysis/debug. 00:05:58.724 [2024-05-14 23:47:59.233467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.659 23:47:59 rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:59.659 23:47:59 rpc -- common/autotest_common.sh@860 -- # return 0 00:05:59.659 23:47:59 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:59.659 23:47:59 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:59.659 23:47:59 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:59.659 23:47:59 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:59.659 23:47:59 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:59.659 23:47:59 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:59.659 23:47:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 ************************************ 00:05:59.659 START TEST rpc_integrity 00:05:59.659 ************************************ 00:05:59.659 23:47:59 rpc.rpc_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:05:59.659 23:47:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:59.659 23:47:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.659 23:47:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 23:47:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.659 23:47:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:59.659 23:47:59 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:59.659 23:48:00 rpc.rpc_integrity -- 
rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:59.659 { 00:05:59.659 "name": "Malloc0", 00:05:59.659 "aliases": [ 00:05:59.659 "d167cb98-e025-4e32-a17c-bc47205bdb0e" 00:05:59.659 ], 00:05:59.659 "product_name": "Malloc disk", 00:05:59.659 "block_size": 512, 00:05:59.659 "num_blocks": 16384, 00:05:59.659 "uuid": "d167cb98-e025-4e32-a17c-bc47205bdb0e", 00:05:59.659 "assigned_rate_limits": { 00:05:59.659 "rw_ios_per_sec": 0, 00:05:59.659 "rw_mbytes_per_sec": 0, 00:05:59.659 "r_mbytes_per_sec": 0, 00:05:59.659 "w_mbytes_per_sec": 0 00:05:59.659 }, 00:05:59.659 "claimed": false, 00:05:59.659 "zoned": false, 00:05:59.659 "supported_io_types": { 00:05:59.659 "read": true, 00:05:59.659 "write": true, 00:05:59.659 "unmap": true, 00:05:59.659 "write_zeroes": true, 00:05:59.659 "flush": true, 00:05:59.659 "reset": true, 00:05:59.659 "compare": false, 00:05:59.659 "compare_and_write": false, 00:05:59.659 "abort": true, 00:05:59.659 "nvme_admin": false, 00:05:59.659 "nvme_io": false 00:05:59.659 }, 00:05:59.659 "memory_domains": [ 00:05:59.659 { 00:05:59.659 "dma_device_id": "system", 00:05:59.659 "dma_device_type": 1 00:05:59.659 }, 00:05:59.659 { 00:05:59.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:59.659 "dma_device_type": 2 00:05:59.659 } 00:05:59.659 ], 00:05:59.659 "driver_specific": {} 00:05:59.659 } 00:05:59.659 ]' 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 [2024-05-14 23:48:00.091048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:59.659 [2024-05-14 23:48:00.091091] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:59.659 [2024-05-14 23:48:00.091111] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x159d640 00:05:59.659 [2024-05-14 23:48:00.091124] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:59.659 [2024-05-14 23:48:00.092849] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:59.659 [2024-05-14 23:48:00.092879] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:59.659 Passthru0 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:59.659 23:48:00 rpc.rpc_integrity -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.659 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.659 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:59.659 { 00:05:59.659 "name": "Malloc0", 00:05:59.660 "aliases": [ 00:05:59.660 "d167cb98-e025-4e32-a17c-bc47205bdb0e" 00:05:59.660 ], 00:05:59.660 "product_name": "Malloc disk", 00:05:59.660 "block_size": 512, 00:05:59.660 "num_blocks": 16384, 00:05:59.660 "uuid": "d167cb98-e025-4e32-a17c-bc47205bdb0e", 00:05:59.660 "assigned_rate_limits": { 00:05:59.660 "rw_ios_per_sec": 0, 00:05:59.660 "rw_mbytes_per_sec": 0, 00:05:59.660 "r_mbytes_per_sec": 0, 00:05:59.660 "w_mbytes_per_sec": 0 00:05:59.660 }, 00:05:59.660 "claimed": true, 00:05:59.660 "claim_type": "exclusive_write", 00:05:59.660 "zoned": false, 00:05:59.660 "supported_io_types": { 00:05:59.660 "read": true, 00:05:59.660 "write": true, 00:05:59.660 "unmap": true, 00:05:59.660 "write_zeroes": true, 00:05:59.660 "flush": true, 00:05:59.660 "reset": true, 00:05:59.660 "compare": false, 00:05:59.660 "compare_and_write": false, 00:05:59.660 "abort": true, 00:05:59.660 "nvme_admin": false, 00:05:59.660 "nvme_io": false 00:05:59.660 }, 00:05:59.660 "memory_domains": [ 00:05:59.660 { 00:05:59.660 "dma_device_id": "system", 00:05:59.660 "dma_device_type": 1 00:05:59.660 }, 00:05:59.660 { 00:05:59.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:59.660 "dma_device_type": 2 00:05:59.660 } 00:05:59.660 ], 00:05:59.660 "driver_specific": {} 00:05:59.660 }, 00:05:59.660 { 00:05:59.660 "name": "Passthru0", 00:05:59.660 "aliases": [ 00:05:59.660 "db3a986e-99b1-5d09-accb-d6225dd8b626" 00:05:59.660 ], 00:05:59.660 "product_name": "passthru", 00:05:59.660 "block_size": 512, 00:05:59.660 "num_blocks": 16384, 00:05:59.660 "uuid": "db3a986e-99b1-5d09-accb-d6225dd8b626", 00:05:59.660 "assigned_rate_limits": { 00:05:59.660 "rw_ios_per_sec": 0, 00:05:59.660 "rw_mbytes_per_sec": 0, 00:05:59.660 "r_mbytes_per_sec": 0, 00:05:59.660 "w_mbytes_per_sec": 0 00:05:59.660 }, 00:05:59.660 "claimed": false, 00:05:59.660 "zoned": false, 00:05:59.660 "supported_io_types": { 00:05:59.660 "read": true, 00:05:59.660 "write": true, 00:05:59.660 "unmap": true, 00:05:59.660 "write_zeroes": true, 00:05:59.660 "flush": true, 00:05:59.660 "reset": true, 00:05:59.660 "compare": false, 00:05:59.660 "compare_and_write": false, 00:05:59.660 "abort": true, 00:05:59.660 "nvme_admin": false, 00:05:59.660 "nvme_io": false 00:05:59.660 }, 00:05:59.660 "memory_domains": [ 00:05:59.660 { 00:05:59.660 "dma_device_id": "system", 00:05:59.660 "dma_device_type": 1 00:05:59.660 }, 00:05:59.660 { 00:05:59.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:59.660 "dma_device_type": 2 00:05:59.660 } 00:05:59.660 ], 00:05:59.660 "driver_specific": { 00:05:59.660 "passthru": { 00:05:59.660 "name": "Passthru0", 00:05:59.660 "base_bdev_name": "Malloc0" 00:05:59.660 } 00:05:59.660 } 00:05:59.660 } 00:05:59.660 ]' 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:59.660 23:48:00 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:59.660 00:05:59.660 real 0m0.302s 00:05:59.660 user 0m0.198s 00:05:59.660 sys 0m0.043s 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:59.660 23:48:00 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:59.660 ************************************ 00:05:59.660 END TEST rpc_integrity 00:05:59.660 ************************************ 00:05:59.919 23:48:00 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:59.919 23:48:00 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:59.919 23:48:00 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:59.919 23:48:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 ************************************ 00:05:59.919 START TEST rpc_plugins 00:05:59.919 ************************************ 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@1121 -- # rpc_plugins 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:59.919 { 00:05:59.919 "name": "Malloc1", 00:05:59.919 "aliases": [ 00:05:59.919 "20c5c99a-fc90-429a-9f57-fe6e6ee677dc" 00:05:59.919 ], 00:05:59.919 "product_name": "Malloc disk", 00:05:59.919 "block_size": 4096, 00:05:59.919 "num_blocks": 256, 00:05:59.919 "uuid": "20c5c99a-fc90-429a-9f57-fe6e6ee677dc", 00:05:59.919 "assigned_rate_limits": { 00:05:59.919 "rw_ios_per_sec": 0, 00:05:59.919 "rw_mbytes_per_sec": 0, 00:05:59.919 "r_mbytes_per_sec": 0, 00:05:59.919 "w_mbytes_per_sec": 0 00:05:59.919 }, 00:05:59.919 "claimed": false, 00:05:59.919 "zoned": false, 00:05:59.919 "supported_io_types": { 00:05:59.919 "read": true, 00:05:59.919 "write": true, 00:05:59.919 "unmap": true, 00:05:59.919 "write_zeroes": true, 00:05:59.919 "flush": true, 00:05:59.919 "reset": true, 00:05:59.919 
"compare": false, 00:05:59.919 "compare_and_write": false, 00:05:59.919 "abort": true, 00:05:59.919 "nvme_admin": false, 00:05:59.919 "nvme_io": false 00:05:59.919 }, 00:05:59.919 "memory_domains": [ 00:05:59.919 { 00:05:59.919 "dma_device_id": "system", 00:05:59.919 "dma_device_type": 1 00:05:59.919 }, 00:05:59.919 { 00:05:59.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:59.919 "dma_device_type": 2 00:05:59.919 } 00:05:59.919 ], 00:05:59.919 "driver_specific": {} 00:05:59.919 } 00:05:59.919 ]' 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:59.919 23:48:00 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:59.919 00:05:59.919 real 0m0.153s 00:05:59.919 user 0m0.093s 00:05:59.919 sys 0m0.029s 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:59.919 23:48:00 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:59.919 ************************************ 00:05:59.919 END TEST rpc_plugins 00:05:59.919 ************************************ 00:06:00.178 23:48:00 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:00.178 23:48:00 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:00.178 23:48:00 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:00.178 23:48:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:00.178 ************************************ 00:06:00.178 START TEST rpc_trace_cmd_test 00:06:00.178 ************************************ 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1121 -- # rpc_trace_cmd_test 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.178 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:00.178 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid345631", 00:06:00.178 "tpoint_group_mask": "0x8", 00:06:00.178 "iscsi_conn": { 00:06:00.178 "mask": "0x2", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "scsi": { 00:06:00.178 "mask": "0x4", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "bdev": { 00:06:00.178 "mask": "0x8", 00:06:00.178 "tpoint_mask": "0xffffffffffffffff" 00:06:00.178 }, 00:06:00.178 "nvmf_rdma": { 
00:06:00.178 "mask": "0x10", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "nvmf_tcp": { 00:06:00.178 "mask": "0x20", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "ftl": { 00:06:00.178 "mask": "0x40", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "blobfs": { 00:06:00.178 "mask": "0x80", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "dsa": { 00:06:00.178 "mask": "0x200", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.178 }, 00:06:00.178 "thread": { 00:06:00.178 "mask": "0x400", 00:06:00.178 "tpoint_mask": "0x0" 00:06:00.179 }, 00:06:00.179 "nvme_pcie": { 00:06:00.179 "mask": "0x800", 00:06:00.179 "tpoint_mask": "0x0" 00:06:00.179 }, 00:06:00.179 "iaa": { 00:06:00.179 "mask": "0x1000", 00:06:00.179 "tpoint_mask": "0x0" 00:06:00.179 }, 00:06:00.179 "nvme_tcp": { 00:06:00.179 "mask": "0x2000", 00:06:00.179 "tpoint_mask": "0x0" 00:06:00.179 }, 00:06:00.179 "bdev_nvme": { 00:06:00.179 "mask": "0x4000", 00:06:00.179 "tpoint_mask": "0x0" 00:06:00.179 }, 00:06:00.179 "sock": { 00:06:00.179 "mask": "0x8000", 00:06:00.179 "tpoint_mask": "0x0" 00:06:00.179 } 00:06:00.179 }' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:00.179 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:00.438 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:00.438 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:00.438 23:48:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:00.438 00:06:00.438 real 0m0.253s 00:06:00.438 user 0m0.205s 00:06:00.438 sys 0m0.040s 00:06:00.438 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:00.438 23:48:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:00.438 ************************************ 00:06:00.438 END TEST rpc_trace_cmd_test 00:06:00.438 ************************************ 00:06:00.438 23:48:00 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:00.438 23:48:00 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:00.438 23:48:00 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:00.438 23:48:00 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:00.438 23:48:00 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:00.438 23:48:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:00.438 ************************************ 00:06:00.438 START TEST rpc_daemon_integrity 00:06:00.438 ************************************ 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.438 23:48:00 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.438 23:48:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.438 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.438 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:00.438 { 00:06:00.438 "name": "Malloc2", 00:06:00.438 "aliases": [ 00:06:00.438 "eb12214a-d833-42c6-ac2c-3bd9415c3eb5" 00:06:00.438 ], 00:06:00.438 "product_name": "Malloc disk", 00:06:00.438 "block_size": 512, 00:06:00.438 "num_blocks": 16384, 00:06:00.438 "uuid": "eb12214a-d833-42c6-ac2c-3bd9415c3eb5", 00:06:00.438 "assigned_rate_limits": { 00:06:00.438 "rw_ios_per_sec": 0, 00:06:00.438 "rw_mbytes_per_sec": 0, 00:06:00.438 "r_mbytes_per_sec": 0, 00:06:00.438 "w_mbytes_per_sec": 0 00:06:00.438 }, 00:06:00.438 "claimed": false, 00:06:00.438 "zoned": false, 00:06:00.438 "supported_io_types": { 00:06:00.438 "read": true, 00:06:00.438 "write": true, 00:06:00.438 "unmap": true, 00:06:00.438 "write_zeroes": true, 00:06:00.438 "flush": true, 00:06:00.438 "reset": true, 00:06:00.438 "compare": false, 00:06:00.438 "compare_and_write": false, 00:06:00.438 "abort": true, 00:06:00.438 "nvme_admin": false, 00:06:00.438 "nvme_io": false 00:06:00.438 }, 00:06:00.438 "memory_domains": [ 00:06:00.438 { 00:06:00.438 "dma_device_id": "system", 00:06:00.438 "dma_device_type": 1 00:06:00.438 }, 00:06:00.438 { 00:06:00.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.438 "dma_device_type": 2 00:06:00.438 } 00:06:00.438 ], 00:06:00.438 "driver_specific": {} 00:06:00.438 } 00:06:00.438 ]' 00:06:00.438 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 [2024-05-14 23:48:01.065896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:00.698 [2024-05-14 23:48:01.065937] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:00.698 [2024-05-14 23:48:01.065959] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x159e7a0 00:06:00.698 [2024-05-14 23:48:01.065972] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:00.698 [2024-05-14 23:48:01.067382] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:00.698 [2024-05-14 
23:48:01.067420] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:00.698 Passthru0 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:00.698 { 00:06:00.698 "name": "Malloc2", 00:06:00.698 "aliases": [ 00:06:00.698 "eb12214a-d833-42c6-ac2c-3bd9415c3eb5" 00:06:00.698 ], 00:06:00.698 "product_name": "Malloc disk", 00:06:00.698 "block_size": 512, 00:06:00.698 "num_blocks": 16384, 00:06:00.698 "uuid": "eb12214a-d833-42c6-ac2c-3bd9415c3eb5", 00:06:00.698 "assigned_rate_limits": { 00:06:00.698 "rw_ios_per_sec": 0, 00:06:00.698 "rw_mbytes_per_sec": 0, 00:06:00.698 "r_mbytes_per_sec": 0, 00:06:00.698 "w_mbytes_per_sec": 0 00:06:00.698 }, 00:06:00.698 "claimed": true, 00:06:00.698 "claim_type": "exclusive_write", 00:06:00.698 "zoned": false, 00:06:00.698 "supported_io_types": { 00:06:00.698 "read": true, 00:06:00.698 "write": true, 00:06:00.698 "unmap": true, 00:06:00.698 "write_zeroes": true, 00:06:00.698 "flush": true, 00:06:00.698 "reset": true, 00:06:00.698 "compare": false, 00:06:00.698 "compare_and_write": false, 00:06:00.698 "abort": true, 00:06:00.698 "nvme_admin": false, 00:06:00.698 "nvme_io": false 00:06:00.698 }, 00:06:00.698 "memory_domains": [ 00:06:00.698 { 00:06:00.698 "dma_device_id": "system", 00:06:00.698 "dma_device_type": 1 00:06:00.698 }, 00:06:00.698 { 00:06:00.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.698 "dma_device_type": 2 00:06:00.698 } 00:06:00.698 ], 00:06:00.698 "driver_specific": {} 00:06:00.698 }, 00:06:00.698 { 00:06:00.698 "name": "Passthru0", 00:06:00.698 "aliases": [ 00:06:00.698 "db479046-2777-51df-bc08-ab3efbc7da93" 00:06:00.698 ], 00:06:00.698 "product_name": "passthru", 00:06:00.698 "block_size": 512, 00:06:00.698 "num_blocks": 16384, 00:06:00.698 "uuid": "db479046-2777-51df-bc08-ab3efbc7da93", 00:06:00.698 "assigned_rate_limits": { 00:06:00.698 "rw_ios_per_sec": 0, 00:06:00.698 "rw_mbytes_per_sec": 0, 00:06:00.698 "r_mbytes_per_sec": 0, 00:06:00.698 "w_mbytes_per_sec": 0 00:06:00.698 }, 00:06:00.698 "claimed": false, 00:06:00.698 "zoned": false, 00:06:00.698 "supported_io_types": { 00:06:00.698 "read": true, 00:06:00.698 "write": true, 00:06:00.698 "unmap": true, 00:06:00.698 "write_zeroes": true, 00:06:00.698 "flush": true, 00:06:00.698 "reset": true, 00:06:00.698 "compare": false, 00:06:00.698 "compare_and_write": false, 00:06:00.698 "abort": true, 00:06:00.698 "nvme_admin": false, 00:06:00.698 "nvme_io": false 00:06:00.698 }, 00:06:00.698 "memory_domains": [ 00:06:00.698 { 00:06:00.698 "dma_device_id": "system", 00:06:00.698 "dma_device_type": 1 00:06:00.698 }, 00:06:00.698 { 00:06:00.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:00.698 "dma_device_type": 2 00:06:00.698 } 00:06:00.698 ], 00:06:00.698 "driver_specific": { 00:06:00.698 "passthru": { 00:06:00.698 "name": "Passthru0", 00:06:00.698 "base_bdev_name": "Malloc2" 00:06:00.698 } 00:06:00.698 } 00:06:00.698 } 00:06:00.698 ]' 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:00.698 23:48:01 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:00.698 00:06:00.698 real 0m0.299s 00:06:00.698 user 0m0.179s 00:06:00.698 sys 0m0.059s 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:00.698 23:48:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:00.698 ************************************ 00:06:00.698 END TEST rpc_daemon_integrity 00:06:00.698 ************************************ 00:06:00.698 23:48:01 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:00.698 23:48:01 rpc -- rpc/rpc.sh@84 -- # killprocess 345631 00:06:00.698 23:48:01 rpc -- common/autotest_common.sh@946 -- # '[' -z 345631 ']' 00:06:00.698 23:48:01 rpc -- common/autotest_common.sh@950 -- # kill -0 345631 00:06:00.698 23:48:01 rpc -- common/autotest_common.sh@951 -- # uname 00:06:00.698 23:48:01 rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:00.698 23:48:01 rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 345631 00:06:00.958 23:48:01 rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:00.958 23:48:01 rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:00.958 23:48:01 rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 345631' 00:06:00.958 killing process with pid 345631 00:06:00.958 23:48:01 rpc -- common/autotest_common.sh@965 -- # kill 345631 00:06:00.958 23:48:01 rpc -- common/autotest_common.sh@970 -- # wait 345631 00:06:01.217 00:06:01.217 real 0m2.850s 00:06:01.217 user 0m3.567s 00:06:01.217 sys 0m0.929s 00:06:01.217 23:48:01 rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:01.217 23:48:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.217 ************************************ 00:06:01.217 END TEST rpc 00:06:01.217 ************************************ 00:06:01.217 23:48:01 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:01.217 23:48:01 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:01.217 23:48:01 -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:06:01.217 23:48:01 -- common/autotest_common.sh@10 -- # set +x 00:06:01.217 ************************************ 00:06:01.217 START TEST skip_rpc 00:06:01.217 ************************************ 00:06:01.217 23:48:01 skip_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:01.476 * Looking for test storage... 00:06:01.476 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:01.476 23:48:01 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:01.476 23:48:01 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:01.476 23:48:01 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:01.476 23:48:01 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:01.476 23:48:01 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:01.476 23:48:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.476 ************************************ 00:06:01.476 START TEST skip_rpc 00:06:01.476 ************************************ 00:06:01.476 23:48:01 skip_rpc.skip_rpc -- common/autotest_common.sh@1121 -- # test_skip_rpc 00:06:01.476 23:48:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=346376 00:06:01.476 23:48:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:01.476 23:48:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:01.476 23:48:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:01.476 [2024-05-14 23:48:02.031959] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
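The test_skip_rpc case above launches the target with --no-rpc-server, so the point of the check that follows is that a JSON-RPC call must fail cleanly when no RPC server is listening. A rough standalone equivalent of the NOT rpc_cmd spdk_get_version assertion, assuming the usual SPDK tree layout and scripts/rpc.py in place of the autotest wrapper:

    build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    tgt_pid=$!
    sleep 5                     # the script sleeps; there is no RPC socket to wait for
    # spdk_get_version must NOT succeed, because nothing listens on /var/tmp/spdk.sock
    if scripts/rpc.py spdk_get_version; then
        echo "unexpected: RPC succeeded without an RPC server" >&2
        exit 1
    fi
    kill $tgt_pid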
00:06:01.476 [2024-05-14 23:48:02.032028] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid346376 ] 00:06:01.735 [2024-05-14 23:48:02.154605] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.735 [2024-05-14 23:48:02.255364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 346376 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@946 -- # '[' -z 346376 ']' 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # kill -0 346376 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # uname 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:07.006 23:48:06 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 346376 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 346376' 00:06:07.006 killing process with pid 346376 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@965 -- # kill 346376 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@970 -- # wait 346376 00:06:07.006 00:06:07.006 real 0m5.482s 00:06:07.006 user 0m5.116s 00:06:07.006 sys 0m0.371s 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:07.006 23:48:07 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.006 ************************************ 00:06:07.006 END TEST skip_rpc 00:06:07.006 ************************************ 00:06:07.006 23:48:07 
skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:07.006 23:48:07 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:07.006 23:48:07 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:07.006 23:48:07 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.006 ************************************ 00:06:07.006 START TEST skip_rpc_with_json 00:06:07.006 ************************************ 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_json 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=347554 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 347554 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@827 -- # '[' -z 347554 ']' 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:07.006 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:07.006 23:48:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:07.266 [2024-05-14 23:48:07.602426] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:06:07.266 [2024-05-14 23:48:07.602492] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid347554 ] 00:06:07.266 [2024-05-14 23:48:07.732560] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.266 [2024-05-14 23:48:07.839112] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # return 0 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 [2024-05-14 23:48:08.525714] nvmf_rpc.c:2531:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:08.203 request: 00:06:08.203 { 00:06:08.203 "trtype": "tcp", 00:06:08.203 "method": "nvmf_get_transports", 00:06:08.203 "req_id": 1 00:06:08.203 } 00:06:08.203 Got JSON-RPC error response 00:06:08.203 response: 00:06:08.203 { 00:06:08.203 "code": -19, 00:06:08.203 "message": "No such device" 00:06:08.203 } 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 [2024-05-14 23:48:08.537843] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:08.203 23:48:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:08.203 { 00:06:08.203 "subsystems": [ 00:06:08.203 { 00:06:08.203 "subsystem": "keyring", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "iobuf", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "iobuf_set_options", 00:06:08.203 "params": { 00:06:08.203 "small_pool_count": 8192, 00:06:08.203 "large_pool_count": 1024, 00:06:08.203 "small_bufsize": 8192, 00:06:08.203 "large_bufsize": 135168 00:06:08.203 } 00:06:08.203 } 00:06:08.203 ] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "sock", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "sock_impl_set_options", 00:06:08.203 "params": { 00:06:08.203 "impl_name": "posix", 00:06:08.203 "recv_buf_size": 2097152, 00:06:08.203 "send_buf_size": 2097152, 00:06:08.203 "enable_recv_pipe": true, 00:06:08.203 "enable_quickack": false, 00:06:08.203 "enable_placement_id": 0, 00:06:08.203 "enable_zerocopy_send_server": true, 
00:06:08.203 "enable_zerocopy_send_client": false, 00:06:08.203 "zerocopy_threshold": 0, 00:06:08.203 "tls_version": 0, 00:06:08.203 "enable_ktls": false 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "sock_impl_set_options", 00:06:08.203 "params": { 00:06:08.203 "impl_name": "ssl", 00:06:08.203 "recv_buf_size": 4096, 00:06:08.203 "send_buf_size": 4096, 00:06:08.203 "enable_recv_pipe": true, 00:06:08.203 "enable_quickack": false, 00:06:08.203 "enable_placement_id": 0, 00:06:08.203 "enable_zerocopy_send_server": true, 00:06:08.203 "enable_zerocopy_send_client": false, 00:06:08.203 "zerocopy_threshold": 0, 00:06:08.203 "tls_version": 0, 00:06:08.203 "enable_ktls": false 00:06:08.203 } 00:06:08.203 } 00:06:08.203 ] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "vmd", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "accel", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "accel_set_options", 00:06:08.203 "params": { 00:06:08.203 "small_cache_size": 128, 00:06:08.203 "large_cache_size": 16, 00:06:08.203 "task_count": 2048, 00:06:08.203 "sequence_count": 2048, 00:06:08.203 "buf_count": 2048 00:06:08.203 } 00:06:08.203 } 00:06:08.203 ] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "bdev", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "bdev_set_options", 00:06:08.203 "params": { 00:06:08.203 "bdev_io_pool_size": 65535, 00:06:08.203 "bdev_io_cache_size": 256, 00:06:08.203 "bdev_auto_examine": true, 00:06:08.203 "iobuf_small_cache_size": 128, 00:06:08.203 "iobuf_large_cache_size": 16 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "bdev_raid_set_options", 00:06:08.203 "params": { 00:06:08.203 "process_window_size_kb": 1024 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "bdev_iscsi_set_options", 00:06:08.203 "params": { 00:06:08.203 "timeout_sec": 30 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "bdev_nvme_set_options", 00:06:08.203 "params": { 00:06:08.203 "action_on_timeout": "none", 00:06:08.203 "timeout_us": 0, 00:06:08.203 "timeout_admin_us": 0, 00:06:08.203 "keep_alive_timeout_ms": 10000, 00:06:08.203 "arbitration_burst": 0, 00:06:08.203 "low_priority_weight": 0, 00:06:08.203 "medium_priority_weight": 0, 00:06:08.203 "high_priority_weight": 0, 00:06:08.203 "nvme_adminq_poll_period_us": 10000, 00:06:08.203 "nvme_ioq_poll_period_us": 0, 00:06:08.203 "io_queue_requests": 0, 00:06:08.203 "delay_cmd_submit": true, 00:06:08.203 "transport_retry_count": 4, 00:06:08.203 "bdev_retry_count": 3, 00:06:08.203 "transport_ack_timeout": 0, 00:06:08.203 "ctrlr_loss_timeout_sec": 0, 00:06:08.203 "reconnect_delay_sec": 0, 00:06:08.203 "fast_io_fail_timeout_sec": 0, 00:06:08.203 "disable_auto_failback": false, 00:06:08.203 "generate_uuids": false, 00:06:08.203 "transport_tos": 0, 00:06:08.203 "nvme_error_stat": false, 00:06:08.203 "rdma_srq_size": 0, 00:06:08.203 "io_path_stat": false, 00:06:08.203 "allow_accel_sequence": false, 00:06:08.203 "rdma_max_cq_size": 0, 00:06:08.203 "rdma_cm_event_timeout_ms": 0, 00:06:08.203 "dhchap_digests": [ 00:06:08.203 "sha256", 00:06:08.203 "sha384", 00:06:08.203 "sha512" 00:06:08.203 ], 00:06:08.203 "dhchap_dhgroups": [ 00:06:08.203 "null", 00:06:08.203 "ffdhe2048", 00:06:08.203 "ffdhe3072", 00:06:08.203 "ffdhe4096", 00:06:08.203 "ffdhe6144", 00:06:08.203 "ffdhe8192" 00:06:08.203 ] 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "bdev_nvme_set_hotplug", 00:06:08.203 "params": { 
00:06:08.203 "period_us": 100000, 00:06:08.203 "enable": false 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "bdev_wait_for_examine" 00:06:08.203 } 00:06:08.203 ] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "scsi", 00:06:08.203 "config": null 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "scheduler", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "framework_set_scheduler", 00:06:08.203 "params": { 00:06:08.203 "name": "static" 00:06:08.203 } 00:06:08.203 } 00:06:08.203 ] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "vhost_scsi", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "vhost_blk", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "ublk", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "nbd", 00:06:08.203 "config": [] 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "subsystem": "nvmf", 00:06:08.203 "config": [ 00:06:08.203 { 00:06:08.203 "method": "nvmf_set_config", 00:06:08.203 "params": { 00:06:08.203 "discovery_filter": "match_any", 00:06:08.203 "admin_cmd_passthru": { 00:06:08.203 "identify_ctrlr": false 00:06:08.203 } 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "nvmf_set_max_subsystems", 00:06:08.203 "params": { 00:06:08.203 "max_subsystems": 1024 00:06:08.203 } 00:06:08.203 }, 00:06:08.203 { 00:06:08.203 "method": "nvmf_set_crdt", 00:06:08.203 "params": { 00:06:08.204 "crdt1": 0, 00:06:08.204 "crdt2": 0, 00:06:08.204 "crdt3": 0 00:06:08.204 } 00:06:08.204 }, 00:06:08.204 { 00:06:08.204 "method": "nvmf_create_transport", 00:06:08.204 "params": { 00:06:08.204 "trtype": "TCP", 00:06:08.204 "max_queue_depth": 128, 00:06:08.204 "max_io_qpairs_per_ctrlr": 127, 00:06:08.204 "in_capsule_data_size": 4096, 00:06:08.204 "max_io_size": 131072, 00:06:08.204 "io_unit_size": 131072, 00:06:08.204 "max_aq_depth": 128, 00:06:08.204 "num_shared_buffers": 511, 00:06:08.204 "buf_cache_size": 4294967295, 00:06:08.204 "dif_insert_or_strip": false, 00:06:08.204 "zcopy": false, 00:06:08.204 "c2h_success": true, 00:06:08.204 "sock_priority": 0, 00:06:08.204 "abort_timeout_sec": 1, 00:06:08.204 "ack_timeout": 0, 00:06:08.204 "data_wr_pool_size": 0 00:06:08.204 } 00:06:08.204 } 00:06:08.204 ] 00:06:08.204 }, 00:06:08.204 { 00:06:08.204 "subsystem": "iscsi", 00:06:08.204 "config": [ 00:06:08.204 { 00:06:08.204 "method": "iscsi_set_options", 00:06:08.204 "params": { 00:06:08.204 "node_base": "iqn.2016-06.io.spdk", 00:06:08.204 "max_sessions": 128, 00:06:08.204 "max_connections_per_session": 2, 00:06:08.204 "max_queue_depth": 64, 00:06:08.204 "default_time2wait": 2, 00:06:08.204 "default_time2retain": 20, 00:06:08.204 "first_burst_length": 8192, 00:06:08.204 "immediate_data": true, 00:06:08.204 "allow_duplicated_isid": false, 00:06:08.204 "error_recovery_level": 0, 00:06:08.204 "nop_timeout": 60, 00:06:08.204 "nop_in_interval": 30, 00:06:08.204 "disable_chap": false, 00:06:08.204 "require_chap": false, 00:06:08.204 "mutual_chap": false, 00:06:08.204 "chap_group": 0, 00:06:08.204 "max_large_datain_per_connection": 64, 00:06:08.204 "max_r2t_per_connection": 4, 00:06:08.204 "pdu_pool_size": 36864, 00:06:08.204 "immediate_data_pool_size": 16384, 00:06:08.204 "data_out_pool_size": 2048 00:06:08.204 } 00:06:08.204 } 00:06:08.204 ] 00:06:08.204 } 00:06:08.204 ] 00:06:08.204 } 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:08.204 23:48:08 
skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 347554 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 347554 ']' 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 347554 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 347554 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 347554' 00:06:08.204 killing process with pid 347554 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 347554 00:06:08.204 23:48:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 347554 00:06:08.772 23:48:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=347746 00:06:08.772 23:48:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:08.772 23:48:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 347746 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 347746 ']' 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 347746 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 347746 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 347746' 00:06:14.048 killing process with pid 347746 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 347746 00:06:14.048 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 347746 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:14.308 00:06:14.308 real 0m7.133s 00:06:14.308 user 0m6.824s 00:06:14.308 sys 0m0.871s 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:14.308 ************************************ 00:06:14.308 END TEST skip_rpc_with_json 
00:06:14.308 ************************************ 00:06:14.308 23:48:14 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:14.308 23:48:14 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:14.308 23:48:14 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:14.308 23:48:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.308 ************************************ 00:06:14.308 START TEST skip_rpc_with_delay 00:06:14.308 ************************************ 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_delay 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:14.308 [2024-05-14 23:48:14.833674] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:14.308 [2024-05-14 23:48:14.833779] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:14.308 00:06:14.308 real 0m0.093s 00:06:14.308 user 0m0.055s 00:06:14.308 sys 0m0.037s 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:14.308 23:48:14 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:14.308 ************************************ 00:06:14.308 END TEST skip_rpc_with_delay 00:06:14.308 ************************************ 00:06:14.308 23:48:14 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:14.567 23:48:14 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:14.567 23:48:14 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:14.567 23:48:14 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:14.567 23:48:14 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:14.567 23:48:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.567 ************************************ 00:06:14.567 START TEST exit_on_failed_rpc_init 00:06:14.567 ************************************ 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1121 -- # test_exit_on_failed_rpc_init 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=348500 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 348500 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@827 -- # '[' -z 348500 ']' 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:14.567 23:48:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:14.567 [2024-05-14 23:48:15.021086] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:06:14.567 [2024-05-14 23:48:15.021154] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid348500 ] 00:06:14.567 [2024-05-14 23:48:15.151593] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.827 [2024-05-14 23:48:15.250373] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.394 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:15.394 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # return 0 00:06:15.394 23:48:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:15.394 23:48:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.394 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:15.395 23:48:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:15.395 [2024-05-14 23:48:15.937830] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:06:15.395 [2024-05-14 23:48:15.937883] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid348677 ] 00:06:15.653 [2024-05-14 23:48:16.038839] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.653 [2024-05-14 23:48:16.136335] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.653 [2024-05-14 23:48:16.136428] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:15.653 [2024-05-14 23:48:16.136445] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:15.653 [2024-05-14 23:48:16.136459] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 348500 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@946 -- # '[' -z 348500 ']' 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # kill -0 348500 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # uname 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 348500 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # echo 'killing process with pid 348500' 00:06:15.913 killing process with pid 348500 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@965 -- # kill 348500 00:06:15.913 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@970 -- # wait 348500 00:06:16.173 00:06:16.173 real 0m1.760s 00:06:16.173 user 0m1.986s 00:06:16.173 sys 0m0.583s 00:06:16.173 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:16.173 23:48:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:16.173 ************************************ 00:06:16.173 END TEST exit_on_failed_rpc_init 00:06:16.173 ************************************ 00:06:16.173 23:48:16 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:16.173 00:06:16.173 real 0m14.957s 00:06:16.173 user 0m14.151s 00:06:16.173 sys 0m2.197s 00:06:16.173 23:48:16 skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:16.173 23:48:16 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.173 ************************************ 00:06:16.173 END TEST skip_rpc 00:06:16.173 ************************************ 00:06:16.432 23:48:16 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:16.432 23:48:16 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:16.432 23:48:16 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:16.432 23:48:16 -- 
common/autotest_common.sh@10 -- # set +x 00:06:16.432 ************************************ 00:06:16.432 START TEST rpc_client 00:06:16.432 ************************************ 00:06:16.432 23:48:16 rpc_client -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:16.432 * Looking for test storage... 00:06:16.432 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:16.432 23:48:16 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:16.432 OK 00:06:16.432 23:48:16 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:16.432 00:06:16.432 real 0m0.145s 00:06:16.432 user 0m0.056s 00:06:16.432 sys 0m0.100s 00:06:16.432 23:48:17 rpc_client -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:16.432 23:48:17 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:16.432 ************************************ 00:06:16.432 END TEST rpc_client 00:06:16.432 ************************************ 00:06:16.692 23:48:17 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:16.692 23:48:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:16.692 23:48:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:16.692 23:48:17 -- common/autotest_common.sh@10 -- # set +x 00:06:16.692 ************************************ 00:06:16.692 START TEST json_config 00:06:16.692 ************************************ 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:16.692 23:48:17 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:16.692 23:48:17 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:16.692 23:48:17 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:16.692 23:48:17 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.692 23:48:17 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.692 23:48:17 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.692 23:48:17 json_config -- paths/export.sh@5 -- # export PATH 00:06:16.692 23:48:17 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@47 -- # : 0 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:16.692 23:48:17 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:16.692 INFO: JSON configuration test init 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.692 23:48:17 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:16.692 23:48:17 json_config -- json_config/common.sh@9 -- # local app=target 00:06:16.692 23:48:17 json_config -- json_config/common.sh@10 -- # shift 00:06:16.692 23:48:17 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:16.692 23:48:17 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:16.692 23:48:17 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:16.692 23:48:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:16.692 23:48:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:16.692 23:48:17 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=348959 00:06:16.692 23:48:17 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:16.692 23:48:17 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:16.692 Waiting for target to run... 
00:06:16.692 23:48:17 json_config -- json_config/common.sh@25 -- # waitforlisten 348959 /var/tmp/spdk_tgt.sock 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@827 -- # '[' -z 348959 ']' 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:16.692 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:16.692 23:48:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:16.692 [2024-05-14 23:48:17.269986] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:06:16.692 [2024-05-14 23:48:17.270057] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid348959 ] 00:06:17.260 [2024-05-14 23:48:17.619489] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.260 [2024-05-14 23:48:17.710688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.857 23:48:18 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:17.857 23:48:18 json_config -- common/autotest_common.sh@860 -- # return 0 00:06:17.857 23:48:18 json_config -- json_config/common.sh@26 -- # echo '' 00:06:17.857 00:06:17.857 23:48:18 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:17.857 23:48:18 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:17.857 23:48:18 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:17.857 23:48:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:17.857 23:48:18 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:17.857 23:48:18 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:17.858 23:48:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:18.116 23:48:18 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:18.116 23:48:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:18.116 [2024-05-14 23:48:18.685597] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:18.116 23:48:18 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:18.116 23:48:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:18.375 [2024-05-14 23:48:18.930219] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:18.375 23:48:18 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:18.375 23:48:18 
json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:18.375 23:48:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:18.658 23:48:19 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:18.658 23:48:19 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:18.658 23:48:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:18.658 [2024-05-14 23:48:19.247916] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:21.949 23:48:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:21.949 23:48:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:21.949 23:48:22 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:21.949 23:48:22 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:21.949 23:48:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:22.208 23:48:22 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:22.208 23:48:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:22.208 Nvme0n1p0 Nvme0n1p1 00:06:22.467 23:48:22 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:22.467 23:48:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:22.467 [2024-05-14 23:48:23.020098] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:22.467 [2024-05-14 23:48:23.020152] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:22.467 00:06:22.467 23:48:23 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:22.467 23:48:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:22.726 Malloc3 00:06:22.726 23:48:23 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:22.726 23:48:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:22.985 [2024-05-14 23:48:23.505508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:22.985 [2024-05-14 23:48:23.505555] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:22.985 [2024-05-14 23:48:23.505575] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ffa270 00:06:22.985 [2024-05-14 23:48:23.505588] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:22.985 [2024-05-14 23:48:23.507377] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:22.985 [2024-05-14 23:48:23.507421] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:22.985 PTBdevFromMalloc3 00:06:22.985 23:48:23 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:22.985 23:48:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:23.242 Null0 00:06:23.242 23:48:23 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:23.242 23:48:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:23.501 Malloc0 00:06:23.501 23:48:24 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:23.501 23:48:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:23.760 Malloc1 00:06:23.760 23:48:24 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:23.760 23:48:24 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:24.018 102400+0 records in 00:06:24.018 102400+0 records out 00:06:24.018 104857600 bytes (105 MB, 100 MiB) copied, 0.306991 s, 342 MB/s 00:06:24.018 23:48:24 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:24.018 23:48:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:24.278 aio_disk 00:06:24.278 23:48:24 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:24.278 23:48:24 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:24.278 23:48:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:29.551 b4e31251-d51e-4fd2-a42a-71832ea51cfd 00:06:29.551 23:48:29 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:29.552 23:48:29 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:29.552 23:48:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:29.552 23:48:29 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:29.552 23:48:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:29.552 23:48:29 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:29.552 23:48:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:29.810 23:48:30 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:29.810 23:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:30.069 23:48:30 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:30.069 23:48:30 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:30.069 23:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:30.328 MallocForCryptoBdev 00:06:30.328 23:48:30 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:30.328 23:48:30 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:30.328 23:48:30 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:30.328 23:48:30 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:30.328 23:48:30 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:30.328 23:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:30.587 [2024-05-14 23:48:30.939777] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:30.587 CryptoMallocBdev 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e bdev_register:07960642-d68b-45a9-8277-86ec38dde10e bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:30.587 23:48:30 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e bdev_register:07960642-d68b-45a9-8277-86ec38dde10e 
bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@71 -- # sort 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@72 -- # sort 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:30.588 23:48:30 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:30.588 23:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:30.848 23:48:31 
json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:07960642-d68b-45a9-8277-86ec38dde10e 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:07960642-d68b-45a9-8277-86ec38dde10e bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 bdev_register:aio_disk bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 bdev_register:CryptoMallocBdev bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\7\9\6\0\6\4\2\-\d\6\8\b\-\4\5\a\9\-\8\2\7\7\-\8\6\e\c\3\8\d\d\e\1\0\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\c\0\0\b\a\5\1\-\2\1\4\8\-\4\c\5\1\-\8\9\2\2\-\4\7\f\8\0\6\5\a\6\4\b\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\d\d\8\8\0\4\c\-\1\8\5\7\-\4\7\a\3\-\9\7\0\b\-\0\d\c\f\b\0\3\0\0\f\5\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\6\b\0\6\c\7\1\-\6\6\6\5\-\4\e\0\c\-\a\3\c\6\-\f\8\a\5\b\f\4\5\a\a\6\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@86 -- # cat 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:07960642-d68b-45a9-8277-86ec38dde10e bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 bdev_register:aio_disk bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 bdev_register:CryptoMallocBdev bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:30.848 Expected events matched: 00:06:30.848 bdev_register:07960642-d68b-45a9-8277-86ec38dde10e 00:06:30.848 bdev_register:1c00ba51-2148-4c51-8922-47f8065a64b7 00:06:30.848 bdev_register:aio_disk 00:06:30.848 bdev_register:cdd8804c-1857-47a3-970b-0dcfb0300f52 00:06:30.848 bdev_register:CryptoMallocBdev 00:06:30.848 bdev_register:d6b06c71-6665-4e0c-a3c6-f8a5bf45aa6e 00:06:30.848 bdev_register:Malloc0 00:06:30.848 bdev_register:Malloc0p0 00:06:30.848 bdev_register:Malloc0p1 00:06:30.848 bdev_register:Malloc0p2 00:06:30.848 bdev_register:Malloc1 00:06:30.848 bdev_register:Malloc3 00:06:30.848 bdev_register:MallocForCryptoBdev 00:06:30.848 bdev_register:Null0 00:06:30.848 bdev_register:Nvme0n1 00:06:30.848 bdev_register:Nvme0n1p0 00:06:30.848 bdev_register:Nvme0n1p1 00:06:30.848 bdev_register:PTBdevFromMalloc3 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:30.848 23:48:31 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:30.848 23:48:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:30.848 23:48:31 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:30.848 23:48:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 
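The notification check traced above reduces to sorting the expected bdev_register events and comparing them with what the target itself reports. A minimal manual equivalent against a running target would be (socket path and jq filter taken from the helper calls above; everything else is illustrative):

  # List the notification types the target supports, then dump all events from id 0
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 \
    | jq -r '.[] | "\(.type):\(.ctx):\(.id)"' | sort

Sorting both sides, as json_config.sh does, keeps the comparison independent of registration order.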
00:06:30.848 23:48:31 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:30.848 23:48:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:31.107 MallocBdevForConfigChangeCheck 00:06:31.107 23:48:31 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:31.107 23:48:31 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:31.107 23:48:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:31.107 23:48:31 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:31.107 23:48:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:31.366 23:48:31 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:31.366 INFO: shutting down applications... 00:06:31.366 23:48:31 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:31.366 23:48:31 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:31.366 23:48:31 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:31.366 23:48:31 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:31.624 [2024-05-14 23:48:32.139475] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:34.914 Calling clear_iscsi_subsystem 00:06:34.915 Calling clear_nvmf_subsystem 00:06:34.915 Calling clear_nbd_subsystem 00:06:34.915 Calling clear_ublk_subsystem 00:06:34.915 Calling clear_vhost_blk_subsystem 00:06:34.915 Calling clear_vhost_scsi_subsystem 00:06:34.915 Calling clear_bdev_subsystem 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:34.915 23:48:34 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:34.915 23:48:35 json_config -- json_config/json_config.sh@345 -- # break 00:06:34.915 23:48:35 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:34.915 23:48:35 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:34.915 23:48:35 json_config -- json_config/common.sh@31 -- # local app=target 00:06:34.915 23:48:35 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:34.915 23:48:35 json_config -- json_config/common.sh@35 -- # [[ -n 348959 ]] 00:06:34.915 23:48:35 json_config -- json_config/common.sh@38 -- # kill -SIGINT 348959 00:06:34.915 23:48:35 json_config 
-- json_config/common.sh@40 -- # (( i = 0 )) 00:06:34.915 23:48:35 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:34.915 23:48:35 json_config -- json_config/common.sh@41 -- # kill -0 348959 00:06:34.915 23:48:35 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:35.482 23:48:35 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:35.482 23:48:35 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:35.482 23:48:35 json_config -- json_config/common.sh@41 -- # kill -0 348959 00:06:35.482 23:48:35 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:35.482 23:48:35 json_config -- json_config/common.sh@43 -- # break 00:06:35.482 23:48:35 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:35.482 23:48:35 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:35.482 SPDK target shutdown done 00:06:35.482 23:48:35 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:35.482 INFO: relaunching applications... 00:06:35.482 23:48:35 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:35.482 23:48:35 json_config -- json_config/common.sh@9 -- # local app=target 00:06:35.482 23:48:35 json_config -- json_config/common.sh@10 -- # shift 00:06:35.482 23:48:35 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:35.482 23:48:35 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:35.482 23:48:35 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:35.482 23:48:35 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:35.482 23:48:35 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:35.482 23:48:35 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=351522 00:06:35.482 23:48:35 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:35.482 Waiting for target to run... 00:06:35.482 23:48:35 json_config -- json_config/common.sh@25 -- # waitforlisten 351522 /var/tmp/spdk_tgt.sock 00:06:35.482 23:48:35 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@827 -- # '[' -z 351522 ']' 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:35.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:35.482 23:48:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:35.482 [2024-05-14 23:48:35.837511] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
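The relaunch above is the heart of the round-trip test: the configuration captured earlier with save_config is handed back to a fresh spdk_tgt at boot. Stripped of the test plumbing it is roughly (flags copied from the spdk_tgt command line above; the file name matches the workspace layout):

  # Capture the running target's state, restart, and replay it at startup
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config > spdk_tgt_config.json
  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json spdk_tgt_config.json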
00:06:35.482 [2024-05-14 23:48:35.837581] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid351522 ] 00:06:35.741 [2024-05-14 23:48:36.194554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.741 [2024-05-14 23:48:36.284564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.741 [2024-05-14 23:48:36.330641] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:36.000 [2024-05-14 23:48:36.338677] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:36.000 [2024-05-14 23:48:36.346701] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:36.000 [2024-05-14 23:48:36.427869] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:38.543 [2024-05-14 23:48:38.844071] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:38.543 [2024-05-14 23:48:38.844126] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:38.543 [2024-05-14 23:48:38.844140] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:38.543 [2024-05-14 23:48:38.852094] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:38.543 [2024-05-14 23:48:38.852121] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:38.543 [2024-05-14 23:48:38.860107] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:38.543 [2024-05-14 23:48:38.860131] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:38.543 [2024-05-14 23:48:38.868140] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:38.543 [2024-05-14 23:48:38.868165] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:38.543 [2024-05-14 23:48:38.868177] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:38.865 [2024-05-14 23:48:39.241235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:38.865 [2024-05-14 23:48:39.241281] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:38.865 [2024-05-14 23:48:39.241299] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f2630 00:06:38.865 [2024-05-14 23:48:39.241316] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:38.865 [2024-05-14 23:48:39.241605] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:38.865 [2024-05-14 23:48:39.241624] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:38.865 23:48:39 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:38.865 23:48:39 json_config -- common/autotest_common.sh@860 -- # return 0 00:06:38.865 23:48:39 json_config -- json_config/common.sh@26 -- # echo '' 00:06:38.865 00:06:38.865 23:48:39 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:38.865 23:48:39 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: 
Checking if target configuration is the same...' 00:06:38.865 INFO: Checking if target configuration is the same... 00:06:38.865 23:48:39 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:38.865 23:48:39 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:38.865 23:48:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:38.865 + '[' 2 -ne 2 ']' 00:06:38.865 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:38.865 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:38.865 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:38.865 +++ basename /dev/fd/62 00:06:38.865 ++ mktemp /tmp/62.XXX 00:06:38.865 + tmp_file_1=/tmp/62.6Nv 00:06:38.865 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:38.865 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:38.865 + tmp_file_2=/tmp/spdk_tgt_config.json.ApW 00:06:38.865 + ret=0 00:06:38.865 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:39.433 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:39.433 + diff -u /tmp/62.6Nv /tmp/spdk_tgt_config.json.ApW 00:06:39.433 + echo 'INFO: JSON config files are the same' 00:06:39.433 INFO: JSON config files are the same 00:06:39.433 + rm /tmp/62.6Nv /tmp/spdk_tgt_config.json.ApW 00:06:39.433 + exit 0 00:06:39.433 23:48:39 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:39.433 23:48:39 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:39.433 INFO: changing configuration and checking if this can be detected... 00:06:39.433 23:48:39 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:39.433 23:48:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:39.433 23:48:39 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:39.433 23:48:39 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:39.433 23:48:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:39.433 + '[' 2 -ne 2 ']' 00:06:39.433 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:39.433 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
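json_diff.sh, whose xtrace starts above, does not diff the raw files: it first normalizes both sides with config_filter.py -method sort so that key ordering cannot produce a spurious mismatch. A hand-run equivalent might look like the following (temporary file names are illustrative; the filter is assumed to read stdin, as the piping in the trace suggests):

  # Normalize the live config and the saved config, then compare them
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config \
    | ./test/json_config/config_filter.py -method sort > /tmp/live_sorted.json
  ./test/json_config/config_filter.py -method sort < spdk_tgt_config.json > /tmp/saved_sorted.json
  diff -u /tmp/saved_sorted.json /tmp/live_sorted.json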
00:06:39.433 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:39.433 +++ basename /dev/fd/62 00:06:39.433 ++ mktemp /tmp/62.XXX 00:06:39.433 + tmp_file_1=/tmp/62.Rea 00:06:39.433 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:39.433 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:39.433 + tmp_file_2=/tmp/spdk_tgt_config.json.ogA 00:06:39.433 + ret=0 00:06:39.433 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:40.010 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:40.010 + diff -u /tmp/62.Rea /tmp/spdk_tgt_config.json.ogA 00:06:40.010 + ret=1 00:06:40.010 + echo '=== Start of file: /tmp/62.Rea ===' 00:06:40.010 + cat /tmp/62.Rea 00:06:40.010 + echo '=== End of file: /tmp/62.Rea ===' 00:06:40.010 + echo '' 00:06:40.010 + echo '=== Start of file: /tmp/spdk_tgt_config.json.ogA ===' 00:06:40.010 + cat /tmp/spdk_tgt_config.json.ogA 00:06:40.010 + echo '=== End of file: /tmp/spdk_tgt_config.json.ogA ===' 00:06:40.010 + echo '' 00:06:40.010 + rm /tmp/62.Rea /tmp/spdk_tgt_config.json.ogA 00:06:40.010 + exit 1 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:40.010 INFO: configuration change detected. 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:40.010 23:48:40 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:40.010 23:48:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@317 -- # [[ -n 351522 ]] 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:40.010 23:48:40 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:40.010 23:48:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:40.010 23:48:40 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:40.010 23:48:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:40.268 23:48:40 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:40.268 23:48:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:40.526 23:48:40 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:40.526 23:48:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:06:40.785 23:48:41 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:40.785 23:48:41 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 23:48:41 json_config -- json_config/json_config.sh@323 -- # killprocess 351522 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@946 -- # '[' -z 351522 ']' 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@950 -- # kill -0 351522 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@951 -- # uname 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 351522 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:41.044 23:48:41 json_config -- common/autotest_common.sh@964 -- # echo 'killing process with pid 351522' 00:06:41.044 killing process with pid 351522 00:06:41.045 23:48:41 json_config -- common/autotest_common.sh@965 -- # kill 351522 00:06:41.045 23:48:41 json_config -- common/autotest_common.sh@970 -- # wait 351522 00:06:44.341 23:48:44 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:44.341 23:48:44 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:44.341 23:48:44 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:44.341 23:48:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:44.341 23:48:44 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:44.341 23:48:44 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:44.341 INFO: Success 00:06:44.341 00:06:44.341 real 0m27.543s 00:06:44.341 user 0m33.301s 00:06:44.341 sys 0m3.940s 00:06:44.341 23:48:44 json_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:44.341 23:48:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:44.341 ************************************ 00:06:44.341 END TEST json_config 00:06:44.341 ************************************ 00:06:44.341 23:48:44 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:44.341 23:48:44 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:44.341 23:48:44 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:44.341 23:48:44 -- common/autotest_common.sh@10 -- # set +x 00:06:44.341 ************************************ 00:06:44.341 START TEST json_config_extra_key 00:06:44.341 ************************************ 00:06:44.341 23:48:44 json_config_extra_key -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:44.341 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:44.341 23:48:44 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:44.341 23:48:44 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:44.341 23:48:44 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:44.341 23:48:44 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:44.341 23:48:44 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:44.342 23:48:44 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:44.342 23:48:44 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:44.342 23:48:44 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:44.342 23:48:44 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:44.342 23:48:44 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:44.342 INFO: launching applications... 
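The extra_key variant reuses the same common.sh helpers, only booting the target from test/json_config/extra_key.json instead of a captured configuration. The launch traced just below amounts to this sketch (flags copied from the spdk_tgt command line in the trace; the trailing RPC is merely an illustrative liveness check, not part of the script):

  # Start the target from the extra_key JSON and confirm the RPC socket answers
  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json test/json_config/extra_key.json &
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock rpc_get_methods > /dev/null && echo 'target is up'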
00:06:44.342 23:48:44 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=352755 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:44.342 Waiting for target to run... 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 352755 /var/tmp/spdk_tgt.sock 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@827 -- # '[' -z 352755 ']' 00:06:44.342 23:48:44 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:44.342 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:44.342 23:48:44 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:44.342 [2024-05-14 23:48:44.906681] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:06:44.342 [2024-05-14 23:48:44.906766] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid352755 ] 00:06:45.279 [2024-05-14 23:48:45.501268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.279 [2024-05-14 23:48:45.612617] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.279 23:48:45 json_config_extra_key -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:45.279 23:48:45 json_config_extra_key -- common/autotest_common.sh@860 -- # return 0 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:45.279 00:06:45.279 23:48:45 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:45.279 INFO: shutting down applications... 
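The shutdown sequence that begins here is the same SIGINT-then-poll loop used earlier for the json_config target: send SIGINT, then probe the pid with kill -0 up to 30 times at half-second intervals. Reduced to its core (the pid variable is illustrative):

  # Ask the target to exit cleanly and wait up to ~15 s for it to go away
  kill -SIGINT "$tgt_pid"
  for ((i = 0; i < 30; i++)); do
      kill -0 "$tgt_pid" 2>/dev/null || break   # kill -0 only tests process existence
      sleep 0.5
  done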
00:06:45.279 23:48:45 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 352755 ]] 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 352755 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 352755 00:06:45.279 23:48:45 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 352755 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:45.848 23:48:46 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:45.848 SPDK target shutdown done 00:06:45.848 23:48:46 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:45.848 Success 00:06:45.848 00:06:45.848 real 0m1.609s 00:06:45.848 user 0m1.106s 00:06:45.848 sys 0m0.723s 00:06:45.848 23:48:46 json_config_extra_key -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:45.848 23:48:46 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:45.848 ************************************ 00:06:45.848 END TEST json_config_extra_key 00:06:45.848 ************************************ 00:06:45.848 23:48:46 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:45.848 23:48:46 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:45.848 23:48:46 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:45.848 23:48:46 -- common/autotest_common.sh@10 -- # set +x 00:06:45.848 ************************************ 00:06:45.848 START TEST alias_rpc 00:06:45.848 ************************************ 00:06:45.848 23:48:46 alias_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:46.107 * Looking for test storage... 
00:06:46.107 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:46.107 23:48:46 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:46.107 23:48:46 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=353120 00:06:46.107 23:48:46 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 353120 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@827 -- # '[' -z 353120 ']' 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:46.107 23:48:46 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:46.107 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:46.107 23:48:46 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.107 [2024-05-14 23:48:46.602601] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:06:46.107 [2024-05-14 23:48:46.602668] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353120 ] 00:06:46.366 [2024-05-14 23:48:46.729297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.366 [2024-05-14 23:48:46.828617] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.935 23:48:47 alias_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:46.935 23:48:47 alias_rpc -- common/autotest_common.sh@860 -- # return 0 00:06:46.935 23:48:47 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:47.504 23:48:47 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 353120 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@946 -- # '[' -z 353120 ']' 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@950 -- # kill -0 353120 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@951 -- # uname 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 353120 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 353120' 00:06:47.504 killing process with pid 353120 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@965 -- # kill 353120 00:06:47.504 23:48:47 alias_rpc -- common/autotest_common.sh@970 -- # wait 353120 00:06:47.763 00:06:47.763 real 0m1.832s 00:06:47.763 user 0m2.013s 00:06:47.763 sys 0m0.558s 00:06:47.763 23:48:48 alias_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:47.763 23:48:48 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.763 ************************************ 00:06:47.763 END TEST alias_rpc 00:06:47.763 
************************************ 00:06:47.763 23:48:48 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:06:47.763 23:48:48 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:47.763 23:48:48 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:47.763 23:48:48 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:47.763 23:48:48 -- common/autotest_common.sh@10 -- # set +x 00:06:47.763 ************************************ 00:06:47.763 START TEST spdkcli_tcp 00:06:47.763 ************************************ 00:06:47.763 23:48:48 spdkcli_tcp -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:48.024 * Looking for test storage... 00:06:48.024 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=353382 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 353382 00:06:48.024 23:48:48 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@827 -- # '[' -z 353382 ']' 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:48.024 23:48:48 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:48.024 [2024-05-14 23:48:48.530953] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
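spdkcli_tcp exercises the same RPC surface over TCP instead of the UNIX socket: as the trace below shows, a socat relay is pointed at the target's /var/tmp/spdk.sock and rpc.py is aimed at 127.0.0.1:9998. By hand that is roughly (retry and timeout values copied from the test invocation below):

  # Bridge TCP port 9998 to the target's UNIX-domain RPC socket, then issue an RPC over TCP
  socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
  ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods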
00:06:48.024 [2024-05-14 23:48:48.531028] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353382 ] 00:06:48.283 [2024-05-14 23:48:48.658605] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:48.283 [2024-05-14 23:48:48.763528] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:48.283 [2024-05-14 23:48:48.763536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.851 23:48:49 spdkcli_tcp -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:48.851 23:48:49 spdkcli_tcp -- common/autotest_common.sh@860 -- # return 0 00:06:49.111 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=353561 00:06:49.111 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:49.111 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:49.111 [ 00:06:49.111 "bdev_malloc_delete", 00:06:49.111 "bdev_malloc_create", 00:06:49.111 "bdev_null_resize", 00:06:49.111 "bdev_null_delete", 00:06:49.111 "bdev_null_create", 00:06:49.111 "bdev_nvme_cuse_unregister", 00:06:49.111 "bdev_nvme_cuse_register", 00:06:49.111 "bdev_opal_new_user", 00:06:49.111 "bdev_opal_set_lock_state", 00:06:49.111 "bdev_opal_delete", 00:06:49.111 "bdev_opal_get_info", 00:06:49.111 "bdev_opal_create", 00:06:49.111 "bdev_nvme_opal_revert", 00:06:49.111 "bdev_nvme_opal_init", 00:06:49.111 "bdev_nvme_send_cmd", 00:06:49.111 "bdev_nvme_get_path_iostat", 00:06:49.111 "bdev_nvme_get_mdns_discovery_info", 00:06:49.111 "bdev_nvme_stop_mdns_discovery", 00:06:49.111 "bdev_nvme_start_mdns_discovery", 00:06:49.111 "bdev_nvme_set_multipath_policy", 00:06:49.111 "bdev_nvme_set_preferred_path", 00:06:49.111 "bdev_nvme_get_io_paths", 00:06:49.111 "bdev_nvme_remove_error_injection", 00:06:49.111 "bdev_nvme_add_error_injection", 00:06:49.111 "bdev_nvme_get_discovery_info", 00:06:49.111 "bdev_nvme_stop_discovery", 00:06:49.111 "bdev_nvme_start_discovery", 00:06:49.111 "bdev_nvme_get_controller_health_info", 00:06:49.111 "bdev_nvme_disable_controller", 00:06:49.111 "bdev_nvme_enable_controller", 00:06:49.111 "bdev_nvme_reset_controller", 00:06:49.111 "bdev_nvme_get_transport_statistics", 00:06:49.111 "bdev_nvme_apply_firmware", 00:06:49.111 "bdev_nvme_detach_controller", 00:06:49.111 "bdev_nvme_get_controllers", 00:06:49.111 "bdev_nvme_attach_controller", 00:06:49.111 "bdev_nvme_set_hotplug", 00:06:49.111 "bdev_nvme_set_options", 00:06:49.111 "bdev_passthru_delete", 00:06:49.111 "bdev_passthru_create", 00:06:49.111 "bdev_lvol_check_shallow_copy", 00:06:49.111 "bdev_lvol_start_shallow_copy", 00:06:49.111 "bdev_lvol_grow_lvstore", 00:06:49.111 "bdev_lvol_get_lvols", 00:06:49.111 "bdev_lvol_get_lvstores", 00:06:49.111 "bdev_lvol_delete", 00:06:49.111 "bdev_lvol_set_read_only", 00:06:49.111 "bdev_lvol_resize", 00:06:49.111 "bdev_lvol_decouple_parent", 00:06:49.111 "bdev_lvol_inflate", 00:06:49.111 "bdev_lvol_rename", 00:06:49.111 "bdev_lvol_clone_bdev", 00:06:49.111 "bdev_lvol_clone", 00:06:49.111 "bdev_lvol_snapshot", 00:06:49.111 "bdev_lvol_create", 00:06:49.111 "bdev_lvol_delete_lvstore", 00:06:49.111 "bdev_lvol_rename_lvstore", 00:06:49.111 "bdev_lvol_create_lvstore", 00:06:49.111 "bdev_raid_set_options", 00:06:49.111 "bdev_raid_remove_base_bdev", 00:06:49.111 
"bdev_raid_add_base_bdev", 00:06:49.111 "bdev_raid_delete", 00:06:49.111 "bdev_raid_create", 00:06:49.111 "bdev_raid_get_bdevs", 00:06:49.111 "bdev_error_inject_error", 00:06:49.111 "bdev_error_delete", 00:06:49.111 "bdev_error_create", 00:06:49.111 "bdev_split_delete", 00:06:49.111 "bdev_split_create", 00:06:49.111 "bdev_delay_delete", 00:06:49.111 "bdev_delay_create", 00:06:49.111 "bdev_delay_update_latency", 00:06:49.111 "bdev_zone_block_delete", 00:06:49.111 "bdev_zone_block_create", 00:06:49.111 "blobfs_create", 00:06:49.111 "blobfs_detect", 00:06:49.111 "blobfs_set_cache_size", 00:06:49.111 "bdev_crypto_delete", 00:06:49.111 "bdev_crypto_create", 00:06:49.111 "bdev_compress_delete", 00:06:49.111 "bdev_compress_create", 00:06:49.111 "bdev_compress_get_orphans", 00:06:49.111 "bdev_aio_delete", 00:06:49.111 "bdev_aio_rescan", 00:06:49.111 "bdev_aio_create", 00:06:49.111 "bdev_ftl_set_property", 00:06:49.111 "bdev_ftl_get_properties", 00:06:49.111 "bdev_ftl_get_stats", 00:06:49.111 "bdev_ftl_unmap", 00:06:49.111 "bdev_ftl_unload", 00:06:49.111 "bdev_ftl_delete", 00:06:49.111 "bdev_ftl_load", 00:06:49.111 "bdev_ftl_create", 00:06:49.111 "bdev_virtio_attach_controller", 00:06:49.111 "bdev_virtio_scsi_get_devices", 00:06:49.111 "bdev_virtio_detach_controller", 00:06:49.111 "bdev_virtio_blk_set_hotplug", 00:06:49.111 "bdev_iscsi_delete", 00:06:49.111 "bdev_iscsi_create", 00:06:49.111 "bdev_iscsi_set_options", 00:06:49.111 "accel_error_inject_error", 00:06:49.111 "ioat_scan_accel_module", 00:06:49.111 "dsa_scan_accel_module", 00:06:49.111 "iaa_scan_accel_module", 00:06:49.111 "dpdk_cryptodev_get_driver", 00:06:49.111 "dpdk_cryptodev_set_driver", 00:06:49.111 "dpdk_cryptodev_scan_accel_module", 00:06:49.111 "compressdev_scan_accel_module", 00:06:49.111 "keyring_file_remove_key", 00:06:49.111 "keyring_file_add_key", 00:06:49.111 "iscsi_get_histogram", 00:06:49.111 "iscsi_enable_histogram", 00:06:49.111 "iscsi_set_options", 00:06:49.111 "iscsi_get_auth_groups", 00:06:49.111 "iscsi_auth_group_remove_secret", 00:06:49.111 "iscsi_auth_group_add_secret", 00:06:49.111 "iscsi_delete_auth_group", 00:06:49.111 "iscsi_create_auth_group", 00:06:49.111 "iscsi_set_discovery_auth", 00:06:49.111 "iscsi_get_options", 00:06:49.111 "iscsi_target_node_request_logout", 00:06:49.111 "iscsi_target_node_set_redirect", 00:06:49.111 "iscsi_target_node_set_auth", 00:06:49.111 "iscsi_target_node_add_lun", 00:06:49.111 "iscsi_get_stats", 00:06:49.111 "iscsi_get_connections", 00:06:49.111 "iscsi_portal_group_set_auth", 00:06:49.111 "iscsi_start_portal_group", 00:06:49.111 "iscsi_delete_portal_group", 00:06:49.111 "iscsi_create_portal_group", 00:06:49.111 "iscsi_get_portal_groups", 00:06:49.111 "iscsi_delete_target_node", 00:06:49.111 "iscsi_target_node_remove_pg_ig_maps", 00:06:49.111 "iscsi_target_node_add_pg_ig_maps", 00:06:49.111 "iscsi_create_target_node", 00:06:49.111 "iscsi_get_target_nodes", 00:06:49.111 "iscsi_delete_initiator_group", 00:06:49.111 "iscsi_initiator_group_remove_initiators", 00:06:49.111 "iscsi_initiator_group_add_initiators", 00:06:49.111 "iscsi_create_initiator_group", 00:06:49.111 "iscsi_get_initiator_groups", 00:06:49.111 "nvmf_set_crdt", 00:06:49.111 "nvmf_set_config", 00:06:49.111 "nvmf_set_max_subsystems", 00:06:49.111 "nvmf_subsystem_get_listeners", 00:06:49.111 "nvmf_subsystem_get_qpairs", 00:06:49.111 "nvmf_subsystem_get_controllers", 00:06:49.111 "nvmf_get_stats", 00:06:49.111 "nvmf_get_transports", 00:06:49.112 "nvmf_create_transport", 00:06:49.112 "nvmf_get_targets", 00:06:49.112 
"nvmf_delete_target", 00:06:49.112 "nvmf_create_target", 00:06:49.112 "nvmf_subsystem_allow_any_host", 00:06:49.112 "nvmf_subsystem_remove_host", 00:06:49.112 "nvmf_subsystem_add_host", 00:06:49.112 "nvmf_ns_remove_host", 00:06:49.112 "nvmf_ns_add_host", 00:06:49.112 "nvmf_subsystem_remove_ns", 00:06:49.112 "nvmf_subsystem_add_ns", 00:06:49.112 "nvmf_subsystem_listener_set_ana_state", 00:06:49.112 "nvmf_discovery_get_referrals", 00:06:49.112 "nvmf_discovery_remove_referral", 00:06:49.112 "nvmf_discovery_add_referral", 00:06:49.112 "nvmf_subsystem_remove_listener", 00:06:49.112 "nvmf_subsystem_add_listener", 00:06:49.112 "nvmf_delete_subsystem", 00:06:49.112 "nvmf_create_subsystem", 00:06:49.112 "nvmf_get_subsystems", 00:06:49.112 "env_dpdk_get_mem_stats", 00:06:49.112 "nbd_get_disks", 00:06:49.112 "nbd_stop_disk", 00:06:49.112 "nbd_start_disk", 00:06:49.112 "ublk_recover_disk", 00:06:49.112 "ublk_get_disks", 00:06:49.112 "ublk_stop_disk", 00:06:49.112 "ublk_start_disk", 00:06:49.112 "ublk_destroy_target", 00:06:49.112 "ublk_create_target", 00:06:49.112 "virtio_blk_create_transport", 00:06:49.112 "virtio_blk_get_transports", 00:06:49.112 "vhost_controller_set_coalescing", 00:06:49.112 "vhost_get_controllers", 00:06:49.112 "vhost_delete_controller", 00:06:49.112 "vhost_create_blk_controller", 00:06:49.112 "vhost_scsi_controller_remove_target", 00:06:49.112 "vhost_scsi_controller_add_target", 00:06:49.112 "vhost_start_scsi_controller", 00:06:49.112 "vhost_create_scsi_controller", 00:06:49.112 "thread_set_cpumask", 00:06:49.112 "framework_get_scheduler", 00:06:49.112 "framework_set_scheduler", 00:06:49.112 "framework_get_reactors", 00:06:49.112 "thread_get_io_channels", 00:06:49.112 "thread_get_pollers", 00:06:49.112 "thread_get_stats", 00:06:49.112 "framework_monitor_context_switch", 00:06:49.112 "spdk_kill_instance", 00:06:49.112 "log_enable_timestamps", 00:06:49.112 "log_get_flags", 00:06:49.112 "log_clear_flag", 00:06:49.112 "log_set_flag", 00:06:49.112 "log_get_level", 00:06:49.112 "log_set_level", 00:06:49.112 "log_get_print_level", 00:06:49.112 "log_set_print_level", 00:06:49.112 "framework_enable_cpumask_locks", 00:06:49.112 "framework_disable_cpumask_locks", 00:06:49.112 "framework_wait_init", 00:06:49.112 "framework_start_init", 00:06:49.112 "scsi_get_devices", 00:06:49.112 "bdev_get_histogram", 00:06:49.112 "bdev_enable_histogram", 00:06:49.112 "bdev_set_qos_limit", 00:06:49.112 "bdev_set_qd_sampling_period", 00:06:49.112 "bdev_get_bdevs", 00:06:49.112 "bdev_reset_iostat", 00:06:49.112 "bdev_get_iostat", 00:06:49.112 "bdev_examine", 00:06:49.112 "bdev_wait_for_examine", 00:06:49.112 "bdev_set_options", 00:06:49.112 "notify_get_notifications", 00:06:49.112 "notify_get_types", 00:06:49.112 "accel_get_stats", 00:06:49.112 "accel_set_options", 00:06:49.112 "accel_set_driver", 00:06:49.112 "accel_crypto_key_destroy", 00:06:49.112 "accel_crypto_keys_get", 00:06:49.112 "accel_crypto_key_create", 00:06:49.112 "accel_assign_opc", 00:06:49.112 "accel_get_module_info", 00:06:49.112 "accel_get_opc_assignments", 00:06:49.112 "vmd_rescan", 00:06:49.112 "vmd_remove_device", 00:06:49.112 "vmd_enable", 00:06:49.112 "sock_get_default_impl", 00:06:49.112 "sock_set_default_impl", 00:06:49.112 "sock_impl_set_options", 00:06:49.112 "sock_impl_get_options", 00:06:49.112 "iobuf_get_stats", 00:06:49.112 "iobuf_set_options", 00:06:49.112 "framework_get_pci_devices", 00:06:49.112 "framework_get_config", 00:06:49.112 "framework_get_subsystems", 00:06:49.112 "trace_get_info", 00:06:49.112 
"trace_get_tpoint_group_mask", 00:06:49.112 "trace_disable_tpoint_group", 00:06:49.112 "trace_enable_tpoint_group", 00:06:49.112 "trace_clear_tpoint_mask", 00:06:49.112 "trace_set_tpoint_mask", 00:06:49.112 "keyring_get_keys", 00:06:49.112 "spdk_get_version", 00:06:49.112 "rpc_get_methods" 00:06:49.112 ] 00:06:49.112 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:49.112 23:48:49 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:49.371 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:49.371 23:48:49 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 353382 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@946 -- # '[' -z 353382 ']' 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@950 -- # kill -0 353382 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@951 -- # uname 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 353382 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:49.371 23:48:49 spdkcli_tcp -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:49.372 23:48:49 spdkcli_tcp -- common/autotest_common.sh@964 -- # echo 'killing process with pid 353382' 00:06:49.372 killing process with pid 353382 00:06:49.372 23:48:49 spdkcli_tcp -- common/autotest_common.sh@965 -- # kill 353382 00:06:49.372 23:48:49 spdkcli_tcp -- common/autotest_common.sh@970 -- # wait 353382 00:06:49.631 00:06:49.631 real 0m1.864s 00:06:49.631 user 0m3.367s 00:06:49.631 sys 0m0.612s 00:06:49.631 23:48:50 spdkcli_tcp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:49.631 23:48:50 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:49.631 ************************************ 00:06:49.631 END TEST spdkcli_tcp 00:06:49.631 ************************************ 00:06:49.891 23:48:50 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:49.891 23:48:50 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:49.891 23:48:50 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:49.891 23:48:50 -- common/autotest_common.sh@10 -- # set +x 00:06:49.891 ************************************ 00:06:49.891 START TEST dpdk_mem_utility 00:06:49.891 ************************************ 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:49.891 * Looking for test storage... 
00:06:49.891 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:49.891 23:48:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:49.891 23:48:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=353687 00:06:49.891 23:48:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 353687 00:06:49.891 23:48:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@827 -- # '[' -z 353687 ']' 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:49.891 23:48:50 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:49.891 [2024-05-14 23:48:50.473218] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:06:49.891 [2024-05-14 23:48:50.473295] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353687 ] 00:06:50.151 [2024-05-14 23:48:50.603182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.151 [2024-05-14 23:48:50.710420] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.092 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:51.092 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@860 -- # return 0 00:06:51.092 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:51.092 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:51.092 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:51.092 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:51.092 { 00:06:51.092 "filename": "/tmp/spdk_mem_dump.txt" 00:06:51.092 } 00:06:51.093 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:51.093 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:51.093 DPDK memory size 816.000000 MiB in 2 heap(s) 00:06:51.093 2 heaps totaling size 816.000000 MiB 00:06:51.093 size: 814.000000 MiB heap id: 0 00:06:51.093 size: 2.000000 MiB heap id: 1 00:06:51.093 end heaps---------- 00:06:51.093 8 mempools totaling size 598.116089 MiB 00:06:51.093 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:51.093 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:51.093 size: 84.521057 MiB name: bdev_io_353687 00:06:51.093 size: 51.011292 MiB name: evtpool_353687 00:06:51.093 size: 50.003479 MiB name: msgpool_353687 00:06:51.093 size: 
21.763794 MiB name: PDU_Pool 00:06:51.093 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:51.093 size: 0.026123 MiB name: Session_Pool 00:06:51.093 end mempools------- 00:06:51.093 201 memzones totaling size 4.173523 MiB 00:06:51.093 size: 1.000366 MiB name: RG_ring_0_353687 00:06:51.093 size: 1.000366 MiB name: RG_ring_1_353687 00:06:51.093 size: 1.000366 MiB name: RG_ring_4_353687 00:06:51.093 size: 1.000366 MiB name: RG_ring_5_353687 00:06:51.093 size: 0.125366 MiB name: RG_ring_2_353687 00:06:51.093 size: 0.015991 MiB name: RG_ring_3_353687 00:06:51.093 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:01.7_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3d:02.7_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:01.7_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:3f:02.7_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:01.7_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.0_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.1_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.2_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.3_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.4_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.5_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.6_qat 00:06:51.093 size: 0.000244 MiB name: 0000:da:02.7_qat 00:06:51.093 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_2 
00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:51.093 size: 0.000122 MiB name: 
rte_cryptodev_data_41 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:51.093 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:51.093 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:51.094 
size: 0.000122 MiB name: rte_compressdev_data_40 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:51.094 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:51.094 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:51.094 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:51.094 end memzones------- 00:06:51.094 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:51.094 heap id: 0 total size: 814.000000 MiB number of busy elements: 519 number of free elements: 14 00:06:51.094 list of free elements. size: 11.817749 MiB 00:06:51.094 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:51.094 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:51.094 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:51.094 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:51.094 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:51.094 element at address: 0x200013800000 with size: 0.978882 MiB 00:06:51.094 element at address: 0x200007000000 with size: 0.960022 MiB 00:06:51.094 element at address: 0x200019200000 with size: 0.937256 MiB 00:06:51.094 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:06:51.094 element at address: 0x200003a00000 with size: 0.498535 MiB 00:06:51.094 element at address: 0x20000b200000 with size: 0.491272 MiB 00:06:51.094 element at address: 0x200000800000 with size: 0.486328 MiB 00:06:51.094 element at address: 0x200019400000 with size: 0.485840 MiB 00:06:51.094 element at address: 0x200027e00000 with size: 0.406189 MiB 00:06:51.094 list of standard malloc elements. 
size: 199.876892 MiB 00:06:51.094 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:51.094 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:51.094 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:51.094 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:51.094 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:51.094 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:51.094 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:51.094 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:51.094 element at address: 0x200000331700 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000334c40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000338180 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000033b6c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000033ec00 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000342140 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000345680 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000348bc0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000034c100 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000034f640 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000352b80 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003560c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000359600 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000035cb40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000360080 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003635c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000367040 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000036aac0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000036e540 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000371fc0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000375a40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003794c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000037cf40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003809c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000384440 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000387ec0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000038b940 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000038f3c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x200000392e40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003968c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000039a340 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000039ddc0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003a1840 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003a52c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003a8d40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003ac7c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003b0240 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003b3cc0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003b7740 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003bb1c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003bec40 with size: 0.004395 MiB 
00:06:51.094 element at address: 0x2000003c26c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003c6140 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003c9bc0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003cd640 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003d10c0 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003d4b40 with size: 0.004395 MiB 00:06:51.094 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:51.094 element at address: 0x20000032f600 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000330680 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000332b40 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000333bc0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000336080 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000337100 with size: 0.004028 MiB 00:06:51.094 element at address: 0x2000003395c0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000033a640 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000033cb00 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000033db80 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000340040 with size: 0.004028 MiB 00:06:51.094 element at address: 0x2000003410c0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000343580 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000344600 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000346ac0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000347b40 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000034a000 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000034b080 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000034d540 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000034e5c0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000350a80 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000351b00 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000353fc0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000355040 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000357500 with size: 0.004028 MiB 00:06:51.094 element at address: 0x200000358580 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000035aa40 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000035bac0 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000035df80 with size: 0.004028 MiB 00:06:51.094 element at address: 0x20000035f000 with size: 0.004028 MiB 00:06:51.094 element at address: 0x2000003614c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000362540 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000364f40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000365fc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003689c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000369a40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000036c440 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000036d4c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000036fec0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000370f40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000373940 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003749c0 with size: 0.004028 MiB 00:06:51.095 element at 
address: 0x2000003773c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000378440 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000037ae40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000037bec0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000037e8c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000037f940 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000382340 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003833c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000385dc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000386e40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000389840 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000038a8c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000038d2c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000038e340 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000390d40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000391dc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003947c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000395840 with size: 0.004028 MiB 00:06:51.095 element at address: 0x200000398240 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003992c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000039bcc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000039cd40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x20000039f740 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003a07c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003a31c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003a6c40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003a7cc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003aa6c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003ab740 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003ae140 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003af1c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003b1bc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003b2c40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003b5640 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003b66c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003b90c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003ba140 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003bcb40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003bdbc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c05c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c1640 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c4040 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c50c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c7ac0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003c8b40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003cb540 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003cc5c0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003cefc0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003d0040 
with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003d2a40 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003d3ac0 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:51.095 element at address: 0x2000002057c0 with size: 0.000305 MiB 00:06:51.095 element at address: 0x200000200000 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200180 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200240 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200300 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200480 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200540 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200600 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200780 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200840 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200900 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200a80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200b40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200c00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200d80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200e40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200f00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201080 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201140 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201200 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201380 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201440 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201500 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201680 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201740 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201800 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201980 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201a40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201b00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201c80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201d40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201e00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000201f80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202040 with size: 0.000183 MiB 
00:06:51.095 element at address: 0x200000202100 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202280 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202340 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202400 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202580 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202640 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202700 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202880 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202940 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202a00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202b80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202c40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202d00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202e80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000202f40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203000 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203180 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203240 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203300 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203480 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203540 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203600 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203780 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203840 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203900 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203a80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203b40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203c00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203d80 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203e40 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203f00 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204080 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204140 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204200 with size: 0.000183 MiB 00:06:51.095 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204380 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204440 with size: 0.000183 MiB 00:06:51.095 element at address: 0x200000204500 with size: 0.000183 MiB 00:06:51.096 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204680 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204740 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204800 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204980 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204a40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204b00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204c80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204d40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204e00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204ec0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000204f80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205040 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205100 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002051c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205280 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205340 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205400 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002054c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205580 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205640 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205700 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205900 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205a80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205b40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205c00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205d80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205e40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205f00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206080 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206140 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206200 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206380 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206440 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206500 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002065c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206680 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206740 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206800 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002068c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206980 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206a40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206b00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206bc0 
with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206c80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206d40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206e00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000206ec0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000002070c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000020b380 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022b640 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022b700 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022b7c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022b880 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022b940 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022ba00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bac0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bb80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bc40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bd00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bdc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022be80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022bf40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c000 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c0c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c180 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c240 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c300 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c500 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c5c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c680 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c740 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c800 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c8c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022c980 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022ca40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cb00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cbc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cc80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cd40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022ce00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cec0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022cf80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022d040 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000022d100 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000032f300 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000032f3c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000332900 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000335e40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000339380 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000033c8c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000033fe00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000343340 with size: 0.000183 MiB 
00:06:51.096 element at address: 0x200000346880 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000349dc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000034d300 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000350840 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000353d80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000003572c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000035a800 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000035dd40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000361280 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000003647c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000364980 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000364b40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000364c00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000368240 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000368400 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000003685c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000368680 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036bcc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036be80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036c040 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036c100 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036f740 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036f900 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036fac0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000036fb80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x2000003731c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000373380 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000373540 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000373600 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000376c40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000376e00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000376fc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000377080 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037a6c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037a880 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037aa40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037ab00 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037e140 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037e300 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037e4c0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x20000037e580 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000381bc0 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000381d80 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000381f40 with size: 0.000183 MiB 00:06:51.096 element at address: 0x200000382000 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000385640 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000385800 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003859c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000385a80 with size: 0.000183 MiB 00:06:51.097 element at 
address: 0x2000003890c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000389280 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000389440 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000389500 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000038cb40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000038cd00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000038cec0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000038cf80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003905c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000390780 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000390940 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000390a00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000394040 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000394200 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003943c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000394480 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000397ac0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000397c80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000397e40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200000397f00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039b700 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039b8c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039b980 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039efc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039f180 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039f340 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000039f400 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a2a40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a2c00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a2dc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a2e80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a64c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a6680 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a6840 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a6900 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003a9f40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003aa100 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003aa2c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003aa380 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003ad9c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003adb80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003add40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003ade00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b1440 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b1600 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b17c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b1880 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b5080 
with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b5240 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b5300 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b8940 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b8b00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b8cc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003b8d80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003bc3c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003bc580 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003bc740 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003bc800 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c0000 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c01c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c0280 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c38c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c3a80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c3c40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c3d00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c7340 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c7500 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c76c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003c7780 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cadc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003caf80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cb140 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cb200 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003ce840 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003cec80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d22c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d2480 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d2640 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d2700 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d5e80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d6100 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d6800 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000003d68c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087c800 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:51.097 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x2000008fd180 with size: 0.000183 MiB 
00:06:51.097 element at address: 0x200027e67fc0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e68080 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6ec80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:51.097 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:51.098 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:51.098 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:51.098 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:51.098 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:51.098 list of memzone associated elements. 
size: 602.305359 MiB 00:06:51.098 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:51.098 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:51.098 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:51.098 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:51.098 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:51.098 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_353687_0 00:06:51.098 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:51.098 associated memzone info: size: 48.002930 MiB name: MP_evtpool_353687_0 00:06:51.098 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:51.098 associated memzone info: size: 48.002930 MiB name: MP_msgpool_353687_0 00:06:51.098 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:51.098 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:51.098 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:51.098 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:51.098 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:51.098 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_353687 00:06:51.098 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:51.098 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_353687 00:06:51.098 element at address: 0x20000022d1c0 with size: 1.008118 MiB 00:06:51.098 associated memzone info: size: 1.007996 MiB name: MP_evtpool_353687 00:06:51.098 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:51.098 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:51.098 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:51.098 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:51.098 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:51.098 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:51.098 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:51.098 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:51.098 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:51.098 associated memzone info: size: 1.000366 MiB name: RG_ring_0_353687 00:06:51.098 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:51.098 associated memzone info: size: 1.000366 MiB name: RG_ring_1_353687 00:06:51.098 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:51.098 associated memzone info: size: 1.000366 MiB name: RG_ring_4_353687 00:06:51.098 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:51.098 associated memzone info: size: 1.000366 MiB name: RG_ring_5_353687 00:06:51.098 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:51.098 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_353687 00:06:51.098 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:51.098 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:51.098 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:51.098 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:51.098 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:51.098 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:51.098 element at address: 0x20000020b440 with size: 0.125488 MiB 00:06:51.098 associated memzone 
info: size: 0.125366 MiB name: RG_ring_2_353687 00:06:51.098 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:06:51.098 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:51.098 element at address: 0x200027e68140 with size: 0.023743 MiB 00:06:51.098 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:51.098 element at address: 0x200000207180 with size: 0.016113 MiB 00:06:51.098 associated memzone info: size: 0.015991 MiB name: RG_ring_3_353687 00:06:51.098 element at address: 0x200027e6e280 with size: 0.002441 MiB 00:06:51.098 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:51.098 element at address: 0x2000003d62c0 with size: 0.001282 MiB 00:06:51.098 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:51.098 element at address: 0x2000003d6a80 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.0_qat 00:06:51.098 element at address: 0x2000003d28c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.1_qat 00:06:51.098 element at address: 0x2000003cee40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.2_qat 00:06:51.098 element at address: 0x2000003cb3c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.3_qat 00:06:51.098 element at address: 0x2000003c7940 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.4_qat 00:06:51.098 element at address: 0x2000003c3ec0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.5_qat 00:06:51.098 element at address: 0x2000003c0440 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.6_qat 00:06:51.098 element at address: 0x2000003bc9c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.7_qat 00:06:51.098 element at address: 0x2000003b8f40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.0_qat 00:06:51.098 element at address: 0x2000003b54c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.1_qat 00:06:51.098 element at address: 0x2000003b1a40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.2_qat 00:06:51.098 element at address: 0x2000003adfc0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.3_qat 00:06:51.098 element at address: 0x2000003aa540 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.4_qat 00:06:51.098 element at address: 0x2000003a6ac0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.5_qat 00:06:51.098 element at address: 0x2000003a3040 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.6_qat 00:06:51.098 element at address: 0x20000039f5c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.7_qat 00:06:51.098 element at address: 0x20000039bb40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.0_qat 00:06:51.098 element at address: 0x2000003980c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 
0000:3f:01.1_qat 00:06:51.098 element at address: 0x200000394640 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.2_qat 00:06:51.098 element at address: 0x200000390bc0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.3_qat 00:06:51.098 element at address: 0x20000038d140 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.4_qat 00:06:51.098 element at address: 0x2000003896c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.5_qat 00:06:51.098 element at address: 0x200000385c40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.6_qat 00:06:51.098 element at address: 0x2000003821c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.7_qat 00:06:51.098 element at address: 0x20000037e740 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.0_qat 00:06:51.098 element at address: 0x20000037acc0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.1_qat 00:06:51.098 element at address: 0x200000377240 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.2_qat 00:06:51.098 element at address: 0x2000003737c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.3_qat 00:06:51.098 element at address: 0x20000036fd40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.4_qat 00:06:51.098 element at address: 0x20000036c2c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.5_qat 00:06:51.098 element at address: 0x200000368840 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.6_qat 00:06:51.098 element at address: 0x200000364dc0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.7_qat 00:06:51.098 element at address: 0x200000361340 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.0_qat 00:06:51.098 element at address: 0x20000035de00 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.1_qat 00:06:51.098 element at address: 0x20000035a8c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.2_qat 00:06:51.098 element at address: 0x200000357380 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.3_qat 00:06:51.098 element at address: 0x200000353e40 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.4_qat 00:06:51.098 element at address: 0x200000350900 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.5_qat 00:06:51.098 element at address: 0x20000034d3c0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.6_qat 00:06:51.098 element at address: 0x200000349e80 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:01.7_qat 00:06:51.098 element at address: 0x200000346940 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:02.0_qat 00:06:51.098 element at address: 
0x200000343400 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:02.1_qat 00:06:51.098 element at address: 0x20000033fec0 with size: 0.000366 MiB 00:06:51.098 associated memzone info: size: 0.000244 MiB name: 0000:da:02.2_qat 00:06:51.098 element at address: 0x20000033c980 with size: 0.000366 MiB 00:06:51.099 associated memzone info: size: 0.000244 MiB name: 0000:da:02.3_qat 00:06:51.099 element at address: 0x200000339440 with size: 0.000366 MiB 00:06:51.099 associated memzone info: size: 0.000244 MiB name: 0000:da:02.4_qat 00:06:51.099 element at address: 0x200000335f00 with size: 0.000366 MiB 00:06:51.099 associated memzone info: size: 0.000244 MiB name: 0000:da:02.5_qat 00:06:51.099 element at address: 0x2000003329c0 with size: 0.000366 MiB 00:06:51.099 associated memzone info: size: 0.000244 MiB name: 0000:da:02.6_qat 00:06:51.099 element at address: 0x20000032f480 with size: 0.000366 MiB 00:06:51.099 associated memzone info: size: 0.000244 MiB name: 0000:da:02.7_qat 00:06:51.099 element at address: 0x2000003d5d40 with size: 0.000305 MiB 00:06:51.099 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:51.099 element at address: 0x20000022c3c0 with size: 0.000305 MiB 00:06:51.099 associated memzone info: size: 0.000183 MiB name: MP_msgpool_353687 00:06:51.099 element at address: 0x200000206f80 with size: 0.000305 MiB 00:06:51.099 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_353687 00:06:51.099 element at address: 0x200027e6ed40 with size: 0.000305 MiB 00:06:51.099 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:51.099 element at address: 0x2000003d6980 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:51.099 element at address: 0x2000003d61c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:51.099 element at address: 0x2000003d5f40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:51.099 element at address: 0x2000003d27c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:51.099 element at address: 0x2000003d2540 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:51.099 element at address: 0x2000003d2380 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:51.099 element at address: 0x2000003ced40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:51.099 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:51.099 element at address: 0x2000003ce900 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:51.099 element at address: 0x2000003cb2c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:51.099 element at address: 0x2000003cb040 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:51.099 element at address: 0x2000003cae80 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:51.099 element at 
address: 0x2000003c7840 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:51.099 element at address: 0x2000003c75c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:51.099 element at address: 0x2000003c7400 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:51.099 element at address: 0x2000003c3dc0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:51.099 element at address: 0x2000003c3b40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:51.099 element at address: 0x2000003c3980 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:51.099 element at address: 0x2000003c0340 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:51.099 element at address: 0x2000003c00c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:51.099 element at address: 0x2000003bff00 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:51.099 element at address: 0x2000003bc8c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:51.099 element at address: 0x2000003bc640 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:51.099 element at address: 0x2000003bc480 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:51.099 element at address: 0x2000003b8e40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:51.099 element at address: 0x2000003b8bc0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:51.099 element at address: 0x2000003b8a00 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:51.099 element at address: 0x2000003b53c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:51.099 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:51.099 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:51.099 element at address: 0x2000003b1940 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:51.099 element at address: 0x2000003b16c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:51.099 element at address: 0x2000003b1500 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:51.099 element at address: 0x2000003adec0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:51.099 element at address: 0x2000003adc40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_11 00:06:51.099 element at address: 0x2000003ada80 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:51.099 element at address: 0x2000003aa440 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:51.099 element at address: 0x2000003aa1c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:51.099 element at address: 0x2000003aa000 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:51.099 element at address: 0x2000003a69c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:51.099 element at address: 0x2000003a6740 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:51.099 element at address: 0x2000003a6580 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:51.099 element at address: 0x2000003a2f40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:51.099 element at address: 0x2000003a2cc0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:51.099 element at address: 0x2000003a2b00 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:51.099 element at address: 0x20000039f4c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:51.099 element at address: 0x20000039f240 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:51.099 element at address: 0x20000039f080 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:51.099 element at address: 0x20000039ba40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:51.099 element at address: 0x20000039b7c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:51.099 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:51.099 element at address: 0x200000397fc0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:51.099 element at address: 0x200000397d40 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:51.099 element at address: 0x200000397b80 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:51.099 element at address: 0x200000394540 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:51.099 element at address: 0x2000003942c0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:51.099 element at address: 0x200000394100 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:51.099 element at address: 
0x200000390ac0 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:51.099 element at address: 0x200000390840 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:51.099 element at address: 0x200000390680 with size: 0.000244 MiB 00:06:51.099 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:51.099 element at address: 0x20000038d040 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:51.100 element at address: 0x20000038cdc0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:51.100 element at address: 0x20000038cc00 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:51.100 element at address: 0x2000003895c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:51.100 element at address: 0x200000389340 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:51.100 element at address: 0x200000389180 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:51.100 element at address: 0x200000385b40 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:51.100 element at address: 0x2000003858c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:51.100 element at address: 0x200000385700 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:51.100 element at address: 0x2000003820c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:51.100 element at address: 0x200000381e40 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:51.100 element at address: 0x200000381c80 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:51.100 element at address: 0x20000037e640 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:51.100 element at address: 0x20000037e3c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:51.100 element at address: 0x20000037e200 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:51.100 element at address: 0x20000037abc0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:51.100 element at address: 0x20000037a940 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:51.100 element at address: 0x20000037a780 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:51.100 element at address: 0x200000377140 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:51.100 element at address: 0x200000376ec0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_26 00:06:51.100 element at address: 0x200000376d00 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:51.100 element at address: 0x2000003736c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:51.100 element at address: 0x200000373440 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:51.100 element at address: 0x200000373280 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:51.100 element at address: 0x20000036fc40 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:51.100 element at address: 0x20000036f9c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:51.100 element at address: 0x20000036f800 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:51.100 element at address: 0x20000036c1c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:51.100 element at address: 0x20000036bf40 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:51.100 element at address: 0x20000036bd80 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:51.100 element at address: 0x200000368740 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:51.100 element at address: 0x2000003684c0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:51.100 element at address: 0x200000368300 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:51.100 element at address: 0x200000364cc0 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:51.100 element at address: 0x200000364a40 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:51.100 element at address: 0x200000364880 with size: 0.000244 MiB 00:06:51.100 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:51.100 element at address: 0x2000003d6040 with size: 0.000183 MiB 00:06:51.100 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:51.100 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:51.100 23:48:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 353687 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@946 -- # '[' -z 353687 ']' 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@950 -- # kill -0 353687 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@951 -- # uname 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 353687 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:51.100 23:48:51 
dpdk_mem_utility -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@964 -- # echo 'killing process with pid 353687' 00:06:51.100 killing process with pid 353687 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@965 -- # kill 353687 00:06:51.100 23:48:51 dpdk_mem_utility -- common/autotest_common.sh@970 -- # wait 353687 00:06:51.670 00:06:51.670 real 0m1.783s 00:06:51.670 user 0m1.906s 00:06:51.670 sys 0m0.589s 00:06:51.670 23:48:52 dpdk_mem_utility -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:51.670 23:48:52 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:51.670 ************************************ 00:06:51.670 END TEST dpdk_mem_utility 00:06:51.670 ************************************ 00:06:51.670 23:48:52 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:51.670 23:48:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:51.670 23:48:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:51.670 23:48:52 -- common/autotest_common.sh@10 -- # set +x 00:06:51.670 ************************************ 00:06:51.670 START TEST event 00:06:51.670 ************************************ 00:06:51.670 23:48:52 event -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:51.929 * Looking for test storage... 00:06:51.929 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:51.929 23:48:52 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:51.929 23:48:52 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:51.929 23:48:52 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:51.929 23:48:52 event -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:06:51.929 23:48:52 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:51.929 23:48:52 event -- common/autotest_common.sh@10 -- # set +x 00:06:51.929 ************************************ 00:06:51.929 START TEST event_perf 00:06:51.929 ************************************ 00:06:51.929 23:48:52 event.event_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:51.929 Running I/O for 1 seconds...[2024-05-14 23:48:52.351888] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
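Editor's note: the long per-element and memzone listing printed above by the dpdk_mem_utility test is plain text, so it can be summarized offline once the console output has been saved. A minimal sketch, assuming the dump was captured to a file named mem_dump.log (the file name is illustrative, not part of the test):

  # Summarize a saved copy of the dump shown above (file name is illustrative).
  DUMP=mem_dump.log
  # Count the per-element entries and add up their reported sizes (in MiB).
  grep -o 'element at address: 0x[0-9a-f]* with size: [0-9.]* MiB' "$DUMP" |
    awk '{ n++; total += $(NF-1) } END { printf "%d elements totalling %.6f MiB\n", n, total }'
  # Count how many distinct rte_cryptodev data memzones the QAT devices registered.
  grep -o 'name: rte_cryptodev_data_[0-9]*' "$DUMP" | sort -u | wc -l

This is only text processing over the log format visible above; it does not query the running application.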
00:06:51.929 [2024-05-14 23:48:52.351949] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354040 ] 00:06:51.929 [2024-05-14 23:48:52.479585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:52.188 [2024-05-14 23:48:52.582493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.188 [2024-05-14 23:48:52.582594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:52.188 [2024-05-14 23:48:52.582695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.188 Running I/O for 1 seconds...[2024-05-14 23:48:52.582695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.125 00:06:53.125 lcore 0: 162996 00:06:53.125 lcore 1: 162996 00:06:53.125 lcore 2: 162996 00:06:53.125 lcore 3: 162996 00:06:53.125 done. 00:06:53.125 00:06:53.125 real 0m1.374s 00:06:53.125 user 0m4.228s 00:06:53.125 sys 0m0.137s 00:06:53.125 23:48:53 event.event_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:53.125 23:48:53 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:53.125 ************************************ 00:06:53.125 END TEST event_perf 00:06:53.125 ************************************ 00:06:53.384 23:48:53 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:53.384 23:48:53 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:06:53.384 23:48:53 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:53.384 23:48:53 event -- common/autotest_common.sh@10 -- # set +x 00:06:53.384 ************************************ 00:06:53.384 START TEST event_reactor 00:06:53.384 ************************************ 00:06:53.384 23:48:53 event.event_reactor -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:53.384 [2024-05-14 23:48:53.817014] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
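Editor's note: the event_perf run above was invoked as test/event/event_perf/event_perf -m 0xF -t 1 and reports one event counter per lcore. A small sketch for re-running it by hand with a few core masks, assuming the same built workspace path used by this job and root access for hugepages:

  # Re-run the event_perf app standalone with a few core masks (paths as used by this job).
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  for mask in 0x1 0x3 0xF; do
    echo "== event_perf with core mask $mask =="
    sudo "$SPDK/test/event/event_perf/event_perf" -m "$mask" -t 1
  done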
00:06:53.384 [2024-05-14 23:48:53.817089] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354236 ] 00:06:53.384 [2024-05-14 23:48:53.950073] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.643 [2024-05-14 23:48:54.052266] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.582 test_start 00:06:54.582 oneshot 00:06:54.582 tick 100 00:06:54.582 tick 100 00:06:54.582 tick 250 00:06:54.582 tick 100 00:06:54.582 tick 100 00:06:54.582 tick 100 00:06:54.582 tick 250 00:06:54.582 tick 500 00:06:54.582 tick 100 00:06:54.582 tick 100 00:06:54.582 tick 250 00:06:54.582 tick 100 00:06:54.582 tick 100 00:06:54.582 test_end 00:06:54.582 00:06:54.582 real 0m1.375s 00:06:54.582 user 0m1.227s 00:06:54.582 sys 0m0.141s 00:06:54.582 23:48:55 event.event_reactor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:54.582 23:48:55 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:54.582 ************************************ 00:06:54.582 END TEST event_reactor 00:06:54.582 ************************************ 00:06:54.877 23:48:55 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:54.878 23:48:55 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:06:54.878 23:48:55 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:54.878 23:48:55 event -- common/autotest_common.sh@10 -- # set +x 00:06:54.878 ************************************ 00:06:54.878 START TEST event_reactor_perf 00:06:54.878 ************************************ 00:06:54.878 23:48:55 event.event_reactor_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:54.878 [2024-05-14 23:48:55.282845] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:06:54.878 [2024-05-14 23:48:55.282920] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354440 ] 00:06:54.878 [2024-05-14 23:48:55.411670] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.137 [2024-05-14 23:48:55.513352] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.074 test_start 00:06:56.074 test_end 00:06:56.074 Performance: 324156 events per second 00:06:56.074 00:06:56.074 real 0m1.363s 00:06:56.074 user 0m1.223s 00:06:56.074 sys 0m0.134s 00:06:56.074 23:48:56 event.event_reactor_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:56.074 23:48:56 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:56.074 ************************************ 00:06:56.074 END TEST event_reactor_perf 00:06:56.074 ************************************ 00:06:56.074 23:48:56 event -- event/event.sh@49 -- # uname -s 00:06:56.333 23:48:56 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:56.333 23:48:56 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:56.333 23:48:56 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:56.333 23:48:56 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:56.333 23:48:56 event -- common/autotest_common.sh@10 -- # set +x 00:06:56.333 ************************************ 00:06:56.333 START TEST event_scheduler 00:06:56.333 ************************************ 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:56.333 * Looking for test storage... 00:06:56.333 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:56.333 23:48:56 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:56.333 23:48:56 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=354664 00:06:56.333 23:48:56 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:56.333 23:48:56 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:56.333 23:48:56 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 354664 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@827 -- # '[' -z 354664 ']' 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:56.333 23:48:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:56.333 [2024-05-14 23:48:56.879557] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
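Editor's note: the reactor_perf run above prints a single "Performance: N events per second" line. A hedged sketch for capturing that figure from a standalone run, assuming the same workspace layout as this job:

  # Run reactor_perf briefly and pull out the reported event rate.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  out=$(sudo "$SPDK/test/event/reactor_perf/reactor_perf" -t 1)
  printf '%s\n' "$out"
  printf '%s\n' "$out" | awk '/Performance:/ { print $2 " events per second" }'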
00:06:56.333 [2024-05-14 23:48:56.879631] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354664 ] 00:06:56.593 [2024-05-14 23:48:57.008606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:56.593 [2024-05-14 23:48:57.108908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.593 [2024-05-14 23:48:57.108983] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.593 [2024-05-14 23:48:57.109083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:56.593 [2024-05-14 23:48:57.109083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@860 -- # return 0 00:06:57.530 23:48:57 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 POWER: Env isn't set yet! 00:06:57.530 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:57.530 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:57.530 POWER: Cannot set governor of lcore 0 to userspace 00:06:57.530 POWER: Attempting to initialise PSTAT power management... 00:06:57.530 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:57.530 POWER: Initialized successfully for lcore 0 power management 00:06:57.530 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:57.530 POWER: Initialized successfully for lcore 1 power management 00:06:57.530 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:57.530 POWER: Initialized successfully for lcore 2 power management 00:06:57.530 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:57.530 POWER: Initialized successfully for lcore 3 power management 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:57 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 [2024-05-14 23:48:57.943747] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
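Editor's note: the trace above launches the scheduler app with --wait-for-rpc, switches the CPU governors, then issues framework_set_scheduler dynamic and framework_start_init before the test threads are created. A minimal sketch of that same RPC bring-up outside the test harness, using only the flags and RPC names visible in the trace (the fixed sleep stands in for the suite's waitforlisten helper):

  # Start the scheduler app in RPC-wait mode and pick the dynamic scheduler before init.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  sudo "$SPDK/test/event/scheduler/scheduler" -m 0xF -p 0x2 --wait-for-rpc -f &
  app_pid=$!
  sleep 2   # the test suite waits with its waitforlisten helper instead of a fixed sleep
  # Default RPC socket, as used in the trace above.
  sudo "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock framework_set_scheduler dynamic
  sudo "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock framework_start_init
  # (exercise the scheduler here, e.g. with the scheduler_thread_* RPCs shown below)
  sudo kill "$app_pid"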
00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:57 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:57.530 23:48:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 ************************************ 00:06:57.530 START TEST scheduler_create_thread 00:06:57.530 ************************************ 00:06:57.530 23:48:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1121 -- # scheduler_create_thread 00:06:57.530 23:48:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:57.530 23:48:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 2 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 3 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 4 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 5 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.530 6 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.530 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.531 7 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.531 8 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.531 9 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.531 10 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.531 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:58.099 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.099 23:48:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:58.099 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.099 23:48:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.477 23:49:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:59.477 23:49:00 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:59.477 23:49:00 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:59.477 23:49:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:59.477 23:49:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.853 23:49:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:00.853 00:07:00.853 real 0m3.103s 00:07:00.853 user 0m0.021s 00:07:00.853 sys 0m0.010s 00:07:00.853 23:49:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:00.853 23:49:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.853 ************************************ 00:07:00.853 END TEST scheduler_create_thread 00:07:00.853 ************************************ 00:07:00.853 23:49:01 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:00.853 23:49:01 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 354664 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@946 -- # '[' -z 354664 ']' 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@950 -- # kill -0 354664 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@951 -- # uname 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 354664 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@952 -- # process_name=reactor_2 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@956 -- # '[' reactor_2 = sudo ']' 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@964 -- # echo 'killing process with pid 354664' 00:07:00.853 killing process with pid 354664 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@965 -- # kill 354664 00:07:00.853 23:49:01 event.event_scheduler -- common/autotest_common.sh@970 -- # wait 354664 00:07:01.112 [2024-05-14 23:49:01.479698] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
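Editor's note: the scheduler_create_thread trace above creates pinned active and idle threads, resizes one to 50% activity, and deletes another, all through the scheduler_plugin RPCs. An abridged replay of the visible calls; rpc_cmd is the suite's wrapper around scripts/rpc.py (so autotest_common.sh must be sourced), -m is the cpumask, and -a appears to be the thread's busy percentage judging by the names used. The numeric thread IDs (11, 12 in this run) are run-specific, hence the command substitution:

  # Replay of the scheduler_plugin RPC sequence from the trace (abridged).
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
  half_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
  rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$half_id" 50
  del_id=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
  rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$del_id"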
00:07:01.112 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:07:01.112 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:07:01.112 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:07:01.112 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:07:01.112 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:07:01.113 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:07:01.113 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:07:01.113 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:07:01.372 00:07:01.372 real 0m5.025s 00:07:01.372 user 0m9.769s 00:07:01.372 sys 0m0.499s 00:07:01.372 23:49:01 event.event_scheduler -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:01.372 23:49:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:01.372 ************************************ 00:07:01.372 END TEST event_scheduler 00:07:01.372 ************************************ 00:07:01.372 23:49:01 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:01.372 23:49:01 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:01.372 23:49:01 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:01.372 23:49:01 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:01.372 23:49:01 event -- common/autotest_common.sh@10 -- # set +x 00:07:01.372 ************************************ 00:07:01.372 START TEST app_repeat 00:07:01.372 ************************************ 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@1121 -- # app_repeat_test 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@19 -- # repeat_pid=355415 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 355415' 00:07:01.372 Process app_repeat pid: 355415 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:01.372 spdk_app_start Round 0 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@25 -- # waitforlisten 355415 /var/tmp/spdk-nbd.sock 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 355415 ']' 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:01.372 23:49:01 event.app_repeat 
-- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:01.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:01.372 23:49:01 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:01.372 23:49:01 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:01.372 [2024-05-14 23:49:01.869781] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:01.372 [2024-05-14 23:49:01.869838] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid355415 ] 00:07:01.631 [2024-05-14 23:49:02.000520] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:01.631 [2024-05-14 23:49:02.108884] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.631 [2024-05-14 23:49:02.108889] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.568 23:49:02 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:02.568 23:49:02 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:07:02.568 23:49:02 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:02.568 Malloc0 00:07:02.568 23:49:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:02.826 Malloc1 00:07:02.826 23:49:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:02.826 23:49:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.826 23:49:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:02.826 23:49:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:02.826 23:49:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:02.827 23:49:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_start_disk Malloc0 /dev/nbd0 00:07:03.085 /dev/nbd0 00:07:03.085 23:49:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:03.085 23:49:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:03.085 1+0 records in 00:07:03.085 1+0 records out 00:07:03.085 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233928 s, 17.5 MB/s 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:03.085 23:49:03 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:03.085 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:03.085 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.085 23:49:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:03.353 /dev/nbd1 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:03.353 1+0 records in 00:07:03.353 1+0 records out 00:07:03.353 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256595 s, 16.0 MB/s 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c 
%s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:03.353 23:49:03 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.353 23:49:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:03.612 23:49:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:03.612 { 00:07:03.612 "nbd_device": "/dev/nbd0", 00:07:03.612 "bdev_name": "Malloc0" 00:07:03.612 }, 00:07:03.612 { 00:07:03.612 "nbd_device": "/dev/nbd1", 00:07:03.612 "bdev_name": "Malloc1" 00:07:03.612 } 00:07:03.612 ]' 00:07:03.612 23:49:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:03.612 { 00:07:03.612 "nbd_device": "/dev/nbd0", 00:07:03.612 "bdev_name": "Malloc0" 00:07:03.612 }, 00:07:03.612 { 00:07:03.612 "nbd_device": "/dev/nbd1", 00:07:03.612 "bdev_name": "Malloc1" 00:07:03.612 } 00:07:03.612 ]' 00:07:03.612 23:49:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:03.872 /dev/nbd1' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:03.872 /dev/nbd1' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:03.872 256+0 records in 00:07:03.872 256+0 records out 00:07:03.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112197 s, 93.5 MB/s 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:03.872 256+0 records in 00:07:03.872 256+0 records out 00:07:03.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0179885 s, 58.3 MB/s 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:03.872 256+0 records in 00:07:03.872 256+0 records out 00:07:03.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0306983 s, 34.2 MB/s 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:03.872 23:49:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:04.132 
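Before the disks are torn down, nbd_dd_data_verify pushes one random 1 MiB pattern through both exported devices and reads it back. A condensed sketch of that data path, assuming the 256 x 4096-byte size and O_DIRECT writes shown in the trace (the temp-file path is shortened here for readability):

    # Condensed sketch of the nbd_dd_data_verify write/verify pass traced above.
    # Assumptions: 1 MiB test size (256 x 4096-byte blocks) as in the log; the real
    # script keeps the temp file under spdk/test/event/ and adds error handling.
    tmp_file=/tmp/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1)

    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256             # one random pattern
    for nbd in "${nbd_list[@]}"; do
        dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct  # write it to each disk
    done
    for nbd in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp_file" "$nbd"                             # byte-compare on read-back
    done
    rm "$tmp_file"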
23:49:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:04.132 23:49:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.391 23:49:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:04.651 23:49:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:04.651 23:49:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:04.911 23:49:05 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:05.169 [2024-05-14 23:49:05.692442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:05.428 [2024-05-14 23:49:05.791578] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:05.428 [2024-05-14 23:49:05.791584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.428 [2024-05-14 23:49:05.844504] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:05.428 [2024-05-14 23:49:05.844556] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
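The count checks that bracket each round come straight from the RPC JSON: nbd_get_disks is parsed with jq and the device paths are counted (2 while the disks are attached, 0 after they are stopped). A sketch of that check, assuming the rpc.py path and socket exactly as they appear in the log:

    # Sketch of the nbd_get_count check: ask the target which nbd devices it still
    # exports and compare against what the test expects at this point in the round.
    # Assumption: rpc.py path and socket are copied verbatim from the trace.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock
    expected=0

    count=$("$rpc" -s "$sock" nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    if [ "$count" -ne "$expected" ]; then
        echo "expected $expected nbd devices, found $count" >&2
        exit 1
    fi

The trailing true in the pipeline mirrors the bare true step at nbd_common.sh@65 in the trace: grep -c exits non-zero when it counts zero matches, so the guard keeps an empty disk list from being treated as a failure.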
00:07:07.962 23:49:08 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:07.962 23:49:08 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:07.962 spdk_app_start Round 1 00:07:07.962 23:49:08 event.app_repeat -- event/event.sh@25 -- # waitforlisten 355415 /var/tmp/spdk-nbd.sock 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 355415 ']' 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:07.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:07.962 23:49:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:08.221 23:49:08 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:08.221 23:49:08 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:07:08.221 23:49:08 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.480 Malloc0 00:07:08.480 23:49:08 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.739 Malloc1 00:07:08.739 23:49:09 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.739 23:49:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:08.997 /dev/nbd0 00:07:08.998 23:49:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:08.998 23:49:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:08.998 1+0 records in 00:07:08.998 1+0 records out 00:07:08.998 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000207309 s, 19.8 MB/s 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:08.998 23:49:09 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:08.998 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:08.998 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.998 23:49:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:09.256 /dev/nbd1 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:09.256 1+0 records in 00:07:09.256 1+0 records out 00:07:09.256 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018746 s, 21.8 MB/s 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@883 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:09.256 23:49:09 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.256 23:49:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:09.515 { 00:07:09.515 "nbd_device": "/dev/nbd0", 00:07:09.515 "bdev_name": "Malloc0" 00:07:09.515 }, 00:07:09.515 { 00:07:09.515 "nbd_device": "/dev/nbd1", 00:07:09.515 "bdev_name": "Malloc1" 00:07:09.515 } 00:07:09.515 ]' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:09.515 { 00:07:09.515 "nbd_device": "/dev/nbd0", 00:07:09.515 "bdev_name": "Malloc0" 00:07:09.515 }, 00:07:09.515 { 00:07:09.515 "nbd_device": "/dev/nbd1", 00:07:09.515 "bdev_name": "Malloc1" 00:07:09.515 } 00:07:09.515 ]' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:09.515 /dev/nbd1' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:09.515 /dev/nbd1' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:09.515 256+0 records in 00:07:09.515 256+0 records out 00:07:09.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103757 s, 101 MB/s 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:09.515 23:49:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:09.515 256+0 records in 00:07:09.515 256+0 records out 00:07:09.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0286404 s, 36.6 MB/s 00:07:09.515 
23:49:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:09.515 256+0 records in 00:07:09.515 256+0 records out 00:07:09.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0205936 s, 50.9 MB/s 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:09.515 23:49:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:09.774 23:49:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 
00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.032 23:49:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:10.290 23:49:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:10.290 23:49:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:10.290 23:49:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:10.549 23:49:10 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:10.549 23:49:10 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:10.834 23:49:11 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:11.093 [2024-05-14 23:49:11.445355] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:11.093 [2024-05-14 23:49:11.545570] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.093 [2024-05-14 23:49:11.545576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.093 [2024-05-14 23:49:11.598593] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:11.093 [2024-05-14 23:49:11.598643] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
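Round 2 repeats the same cycle, so the shape of the whole test is already visible: three rounds of create, verify, kill, followed by a final come-up and killprocess. A compressed sketch of the driving loop, assuming the helper names shown in the xtrace (their bodies live in the real event and nbd scripts and are not repeated here) and that the app_repeat binary restarts spdk_app_start itself after each SIGTERM:

    # Compressed sketch of the app_repeat driving loop reconstructed from the trace.
    # Assumptions: helper names (waitforlisten, nbd_rpc_data_verify, killprocess) are
    # the ones the xtrace shows; the app_repeat binary handles the restart internally,
    # which is why the next round's startup notices follow each SIGTERM.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock

    for i in {0..2}; do
        echo "spdk_app_start Round $i"
        waitforlisten "$repeat_pid" "$sock"                  # RPC socket is up again
        "$rpc" -s "$sock" bdev_malloc_create 64 4096         # Malloc0
        "$rpc" -s "$sock" bdev_malloc_create 64 4096         # Malloc1
        nbd_rpc_data_verify "$sock" 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
        "$rpc" -s "$sock" spdk_kill_instance SIGTERM         # end this round
        sleep 3
    done
    waitforlisten "$repeat_pid" "$sock"                      # Round 3: final come-up
    killprocess "$repeat_pid"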
00:07:13.624 23:49:14 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:13.624 23:49:14 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:13.624 spdk_app_start Round 2 00:07:13.624 23:49:14 event.app_repeat -- event/event.sh@25 -- # waitforlisten 355415 /var/tmp/spdk-nbd.sock 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 355415 ']' 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:13.624 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:13.624 23:49:14 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:13.892 23:49:14 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:13.892 23:49:14 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:07:13.892 23:49:14 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:14.155 Malloc0 00:07:14.155 23:49:14 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:14.414 Malloc1 00:07:14.414 23:49:14 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:14.414 23:49:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:14.672 /dev/nbd0 00:07:14.672 23:49:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:14.672 23:49:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:14.672 1+0 records in 00:07:14.672 1+0 records out 00:07:14.672 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317627 s, 12.9 MB/s 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:14.672 23:49:15 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:14.672 23:49:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:14.672 23:49:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:14.672 23:49:15 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:14.931 /dev/nbd1 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:14.931 1+0 records in 00:07:14.931 1+0 records out 00:07:14.931 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260818 s, 15.7 MB/s 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@883 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:14.931 23:49:15 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.931 23:49:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:15.189 23:49:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:15.189 { 00:07:15.189 "nbd_device": "/dev/nbd0", 00:07:15.190 "bdev_name": "Malloc0" 00:07:15.190 }, 00:07:15.190 { 00:07:15.190 "nbd_device": "/dev/nbd1", 00:07:15.190 "bdev_name": "Malloc1" 00:07:15.190 } 00:07:15.190 ]' 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:15.190 { 00:07:15.190 "nbd_device": "/dev/nbd0", 00:07:15.190 "bdev_name": "Malloc0" 00:07:15.190 }, 00:07:15.190 { 00:07:15.190 "nbd_device": "/dev/nbd1", 00:07:15.190 "bdev_name": "Malloc1" 00:07:15.190 } 00:07:15.190 ]' 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:15.190 /dev/nbd1' 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:15.190 /dev/nbd1' 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:15.190 23:49:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:15.448 23:49:15 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:15.448 256+0 records in 00:07:15.448 256+0 records out 00:07:15.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111354 s, 94.2 MB/s 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:15.449 256+0 records in 00:07:15.449 256+0 records out 00:07:15.449 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0177889 s, 58.9 MB/s 00:07:15.449 
23:49:15 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:15.449 256+0 records in 00:07:15.449 256+0 records out 00:07:15.449 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0309981 s, 33.8 MB/s 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:15.449 23:49:15 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:15.707 23:49:16 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 
00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.965 23:49:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:16.224 23:49:16 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:16.224 23:49:16 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:16.483 23:49:16 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:16.742 [2024-05-14 23:49:17.258072] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:17.002 [2024-05-14 23:49:17.359508] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.002 [2024-05-14 23:49:17.359513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.002 [2024-05-14 23:49:17.413670] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:17.002 [2024-05-14 23:49:17.413721] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:19.537 23:49:19 event.app_repeat -- event/event.sh@38 -- # waitforlisten 355415 /var/tmp/spdk-nbd.sock 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 355415 ']' 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:07:19.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:19.537 23:49:19 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:07:19.796 23:49:20 event.app_repeat -- event/event.sh@39 -- # killprocess 355415 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@946 -- # '[' -z 355415 ']' 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@950 -- # kill -0 355415 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@951 -- # uname 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 355415 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 355415' 00:07:19.796 killing process with pid 355415 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@965 -- # kill 355415 00:07:19.796 23:49:20 event.app_repeat -- common/autotest_common.sh@970 -- # wait 355415 00:07:20.055 spdk_app_start is called in Round 0. 00:07:20.055 Shutdown signal received, stop current app iteration 00:07:20.055 Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 reinitialization... 00:07:20.055 spdk_app_start is called in Round 1. 00:07:20.055 Shutdown signal received, stop current app iteration 00:07:20.055 Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 reinitialization... 00:07:20.055 spdk_app_start is called in Round 2. 00:07:20.055 Shutdown signal received, stop current app iteration 00:07:20.055 Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 reinitialization... 00:07:20.055 spdk_app_start is called in Round 3. 
00:07:20.055 Shutdown signal received, stop current app iteration 00:07:20.055 23:49:20 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:20.055 23:49:20 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:20.055 00:07:20.055 real 0m18.687s 00:07:20.055 user 0m40.356s 00:07:20.055 sys 0m3.766s 00:07:20.055 23:49:20 event.app_repeat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:20.055 23:49:20 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.055 ************************************ 00:07:20.055 END TEST app_repeat 00:07:20.055 ************************************ 00:07:20.055 23:49:20 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:20.055 00:07:20.055 real 0m28.388s 00:07:20.055 user 0m56.980s 00:07:20.055 sys 0m5.083s 00:07:20.055 23:49:20 event -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:20.055 23:49:20 event -- common/autotest_common.sh@10 -- # set +x 00:07:20.055 ************************************ 00:07:20.055 END TEST event 00:07:20.055 ************************************ 00:07:20.055 23:49:20 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:20.055 23:49:20 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:20.055 23:49:20 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:20.055 23:49:20 -- common/autotest_common.sh@10 -- # set +x 00:07:20.055 ************************************ 00:07:20.055 START TEST thread 00:07:20.055 ************************************ 00:07:20.055 23:49:20 thread -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:20.314 * Looking for test storage... 00:07:20.314 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:20.314 23:49:20 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:20.314 23:49:20 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:20.314 23:49:20 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:20.314 23:49:20 thread -- common/autotest_common.sh@10 -- # set +x 00:07:20.314 ************************************ 00:07:20.314 START TEST thread_poller_perf 00:07:20.314 ************************************ 00:07:20.314 23:49:20 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:20.314 [2024-05-14 23:49:20.819883] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:20.314 [2024-05-14 23:49:20.819948] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358111 ] 00:07:20.573 [2024-05-14 23:49:20.950333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.573 [2024-05-14 23:49:21.047263] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.573 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:21.966 ====================================== 00:07:21.966 busy:2307241960 (cyc) 00:07:21.966 total_run_count: 266000 00:07:21.966 tsc_hz: 2300000000 (cyc) 00:07:21.966 ====================================== 00:07:21.966 poller_cost: 8673 (cyc), 3770 (nsec) 00:07:21.966 00:07:21.966 real 0m1.374s 00:07:21.966 user 0m1.226s 00:07:21.966 sys 0m0.141s 00:07:21.966 23:49:22 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:21.966 23:49:22 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:21.966 ************************************ 00:07:21.966 END TEST thread_poller_perf 00:07:21.966 ************************************ 00:07:21.966 23:49:22 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:21.966 23:49:22 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:21.966 23:49:22 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:21.966 23:49:22 thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.966 ************************************ 00:07:21.966 START TEST thread_poller_perf 00:07:21.966 ************************************ 00:07:21.966 23:49:22 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:21.966 [2024-05-14 23:49:22.265861] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:21.966 [2024-05-14 23:49:22.265917] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358310 ] 00:07:21.966 [2024-05-14 23:49:22.392514] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.966 [2024-05-14 23:49:22.490140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.966 Running 1000 pollers for 1 seconds with 0 microseconds period. 
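For reference, the poller_cost line printed above is consistent with plain integer arithmetic on the other counters: cycles per call is busy divided by total_run_count, and the nanosecond figure converts that through tsc_hz. A quick back-of-envelope check in bash (not poller_perf source, just reproducing the printed numbers):

    busy=2307241960; runs=266000; tsc_hz=2300000000
    cyc=$(( busy / runs ))                        # 8673 cycles per poller invocation
    nsec=$(( cyc * 1000000000 / tsc_hz ))         # 3770 ns at 2.3 GHz
    echo "poller_cost: ${cyc} (cyc), ${nsec} (nsec)"

The zero-period run reported below (655 cyc, 284 nsec from 3514000 runs) follows from the same formula.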
00:07:23.342 ====================================== 00:07:23.342 busy:2302255406 (cyc) 00:07:23.342 total_run_count: 3514000 00:07:23.342 tsc_hz: 2300000000 (cyc) 00:07:23.342 ====================================== 00:07:23.342 poller_cost: 655 (cyc), 284 (nsec) 00:07:23.342 00:07:23.342 real 0m1.363s 00:07:23.342 user 0m1.221s 00:07:23.342 sys 0m0.135s 00:07:23.342 23:49:23 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:23.342 23:49:23 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:23.342 ************************************ 00:07:23.342 END TEST thread_poller_perf 00:07:23.342 ************************************ 00:07:23.342 23:49:23 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:23.342 00:07:23.342 real 0m2.999s 00:07:23.342 user 0m2.538s 00:07:23.342 sys 0m0.463s 00:07:23.342 23:49:23 thread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:23.342 23:49:23 thread -- common/autotest_common.sh@10 -- # set +x 00:07:23.342 ************************************ 00:07:23.342 END TEST thread 00:07:23.342 ************************************ 00:07:23.342 23:49:23 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:23.342 23:49:23 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:23.342 23:49:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:23.342 23:49:23 -- common/autotest_common.sh@10 -- # set +x 00:07:23.342 ************************************ 00:07:23.342 START TEST accel 00:07:23.342 ************************************ 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:23.342 * Looking for test storage... 00:07:23.342 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:23.342 23:49:23 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:23.342 23:49:23 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:23.342 23:49:23 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:23.342 23:49:23 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=358551 00:07:23.342 23:49:23 accel -- accel/accel.sh@63 -- # waitforlisten 358551 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@827 -- # '[' -z 358551 ']' 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:23.342 23:49:23 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.342 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:23.342 23:49:23 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:23.342 23:49:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:23.342 23:49:23 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:23.342 23:49:23 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.342 23:49:23 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.342 23:49:23 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.342 23:49:23 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:23.342 23:49:23 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:23.342 23:49:23 accel -- accel/accel.sh@41 -- # jq -r . 00:07:23.342 [2024-05-14 23:49:23.898632] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:23.342 [2024-05-14 23:49:23.898706] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358551 ] 00:07:23.601 [2024-05-14 23:49:24.028654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.601 [2024-05-14 23:49:24.126592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@860 -- # return 0 00:07:24.539 23:49:24 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:24.539 23:49:24 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:24.539 23:49:24 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:24.539 23:49:24 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:24.539 23:49:24 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:24.539 23:49:24 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:24.539 23:49:24 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 
23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # IFS== 00:07:24.539 23:49:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:24.539 23:49:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:24.539 23:49:24 accel -- accel/accel.sh@75 -- # killprocess 358551 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@946 -- # '[' -z 358551 ']' 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@950 -- # kill -0 358551 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@951 -- # uname 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 358551 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 358551' 00:07:24.539 killing process with pid 358551 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@965 -- # kill 358551 00:07:24.539 23:49:24 accel -- common/autotest_common.sh@970 -- # wait 358551 00:07:24.799 23:49:25 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:24.799 23:49:25 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:24.799 23:49:25 accel -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:07:24.799 23:49:25 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:24.799 23:49:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:24.799 23:49:25 accel.accel_help -- common/autotest_common.sh@1121 -- # accel_perf -h 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:24.799 23:49:25 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
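The long run of IFS== / read -r opc module lines above is accel.sh consuming the accel_get_opc_assignments RPC reply: the jq filter flattens the returned JSON object into opc=module pairs and the loop stores each one in expected_opcs (every opcode maps to software in this run). As an illustration, the same jq filter applied to a made-up two-opcode reply (the sample JSON is an assumption, not output captured from this run):

    echo '{"copy":"software","crc32c":"software"}' \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    # prints:
    #   copy=software
    #   crc32c=software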
00:07:25.058 23:49:25 accel.accel_help -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.058 23:49:25 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:25.058 23:49:25 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:25.058 23:49:25 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:25.058 23:49:25 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:25.058 23:49:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.058 ************************************ 00:07:25.058 START TEST accel_missing_filename 00:07:25.058 ************************************ 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.058 23:49:25 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:25.058 23:49:25 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:25.058 [2024-05-14 23:49:25.536835] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:25.058 [2024-05-14 23:49:25.536900] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358839 ] 00:07:25.317 [2024-05-14 23:49:25.651535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.317 [2024-05-14 23:49:25.752777] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.317 [2024-05-14 23:49:25.826487] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:25.317 [2024-05-14 23:49:25.901224] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:07:25.577 A filename is required. 
00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:25.577 00:07:25.577 real 0m0.517s 00:07:25.577 user 0m0.362s 00:07:25.577 sys 0m0.178s 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.577 23:49:26 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:25.577 ************************************ 00:07:25.577 END TEST accel_missing_filename 00:07:25.577 ************************************ 00:07:25.577 23:49:26 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:25.577 23:49:26 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:07:25.577 23:49:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:25.577 23:49:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.577 ************************************ 00:07:25.577 START TEST accel_compress_verify 00:07:25.577 ************************************ 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.577 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:25.577 23:49:26 
accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:25.577 23:49:26 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:25.577 [2024-05-14 23:49:26.145298] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:25.577 [2024-05-14 23:49:26.145362] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358954 ] 00:07:25.837 [2024-05-14 23:49:26.276113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.837 [2024-05-14 23:49:26.380605] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.096 [2024-05-14 23:49:26.453347] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:26.096 [2024-05-14 23:49:26.527807] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:07:26.096 00:07:26.096 Compression does not support the verify option, aborting. 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:26.096 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:26.096 00:07:26.096 real 0m0.537s 00:07:26.097 user 0m0.364s 00:07:26.097 sys 0m0.200s 00:07:26.097 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:26.097 23:49:26 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:26.097 ************************************ 00:07:26.097 END TEST accel_compress_verify 00:07:26.097 ************************************ 00:07:26.097 23:49:26 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:26.097 23:49:26 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:26.097 23:49:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:26.097 23:49:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.357 ************************************ 00:07:26.357 START TEST accel_wrong_workload 00:07:26.357 ************************************ 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w foobar 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:26.357 23:49:26 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:26.357 Unsupported workload type: foobar 00:07:26.357 [2024-05-14 23:49:26.761082] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:26.357 accel_perf options: 00:07:26.357 [-h help message] 00:07:26.357 [-q queue depth per core] 00:07:26.357 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:26.357 [-T number of threads per core 00:07:26.357 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:26.357 [-t time in seconds] 00:07:26.357 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:26.357 [ dif_verify, , dif_generate, dif_generate_copy 00:07:26.357 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:26.357 [-l for compress/decompress workloads, name of uncompressed input file 00:07:26.357 [-S for crc32c workload, use this seed value (default 0) 00:07:26.357 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:26.357 [-f for fill workload, use this BYTE value (default 255) 00:07:26.357 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:26.357 [-y verify result if this switch is on] 00:07:26.357 [-a tasks to allocate per core (default: same value as -q)] 00:07:26.357 Can be used to spread operations across a wider range of memory. 
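For contrast with the rejected foobar workload, a well-formed invocation combines a workload from the list above with the flags the usage text documents. An illustrative command line assembled only from those documented options (assumed to be run from the SPDK repo root; not a command taken from this log):

    # crc32c for 1 second, seed 32, queue depth 64, verifying results
    ./build/examples/accel_perf -t 1 -w crc32c -S 32 -q 64 -y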
00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:26.357 00:07:26.357 real 0m0.038s 00:07:26.357 user 0m0.019s 00:07:26.357 sys 0m0.019s 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:26.357 23:49:26 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:26.357 ************************************ 00:07:26.357 END TEST accel_wrong_workload 00:07:26.357 ************************************ 00:07:26.357 Error: writing output failed: Broken pipe 00:07:26.357 23:49:26 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.357 ************************************ 00:07:26.357 START TEST accel_negative_buffers 00:07:26.357 ************************************ 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:26.357 23:49:26 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:26.357 -x option must be non-negative. 
00:07:26.357 [2024-05-14 23:49:26.877937] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:26.357 accel_perf options: 00:07:26.357 [-h help message] 00:07:26.357 [-q queue depth per core] 00:07:26.357 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:26.357 [-T number of threads per core 00:07:26.357 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:26.357 [-t time in seconds] 00:07:26.357 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:26.357 [ dif_verify, , dif_generate, dif_generate_copy 00:07:26.357 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:26.357 [-l for compress/decompress workloads, name of uncompressed input file 00:07:26.357 [-S for crc32c workload, use this seed value (default 0) 00:07:26.357 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:26.357 [-f for fill workload, use this BYTE value (default 255) 00:07:26.357 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:26.357 [-y verify result if this switch is on] 00:07:26.357 [-a tasks to allocate per core (default: same value as -q)] 00:07:26.357 Can be used to spread operations across a wider range of memory. 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:26.357 00:07:26.357 real 0m0.041s 00:07:26.357 user 0m0.041s 00:07:26.357 sys 0m0.022s 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:26.357 23:49:26 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:26.357 ************************************ 00:07:26.357 END TEST accel_negative_buffers 00:07:26.357 ************************************ 00:07:26.357 23:49:26 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:26.357 23:49:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.617 ************************************ 00:07:26.617 START TEST accel_crc32c 00:07:26.617 ************************************ 00:07:26.617 23:49:26 accel.accel_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:26.617 23:49:26 accel.accel_crc32c -- 
accel/accel.sh@12 -- # build_accel_config 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:26.617 23:49:26 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:26.617 [2024-05-14 23:49:26.993049] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:26.617 [2024-05-14 23:49:26.993106] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359148 ] 00:07:26.617 [2024-05-14 23:49:27.122465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.876 [2024-05-14 23:49:27.224032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 
bytes' 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:26.876 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:26.877 23:49:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:28.295 23:49:28 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.295 00:07:28.295 real 0m1.526s 00:07:28.295 user 0m1.334s 00:07:28.295 sys 0m0.189s 00:07:28.295 23:49:28 accel.accel_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:28.295 23:49:28 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:28.295 ************************************ 00:07:28.295 END TEST accel_crc32c 00:07:28.295 ************************************ 00:07:28.295 23:49:28 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:28.295 23:49:28 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:28.295 23:49:28 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:28.295 23:49:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.295 ************************************ 00:07:28.295 START TEST accel_crc32c_C2 00:07:28.295 ************************************ 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.295 
23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:28.295 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:28.295 [2024-05-14 23:49:28.598161] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:28.295 [2024-05-14 23:49:28.598219] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359375 ] 00:07:28.295 [2024-05-14 23:49:28.727487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.295 [2024-05-14 23:49:28.833087] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.555 23:49:28 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:28.555 23:49:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read 
-r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.935 00:07:29.935 real 0m1.532s 00:07:29.935 user 0m1.333s 00:07:29.935 sys 0m0.192s 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:29.935 23:49:30 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:29.935 ************************************ 00:07:29.935 END TEST accel_crc32c_C2 00:07:29.935 ************************************ 00:07:29.935 23:49:30 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:29.935 23:49:30 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:29.935 23:49:30 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:29.935 23:49:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:29.935 ************************************ 00:07:29.935 START TEST accel_copy 00:07:29.935 ************************************ 00:07:29.935 23:49:30 accel.accel_copy -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy -y 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@12 -- # 
build_accel_config 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:29.935 [2024-05-14 23:49:30.224906] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:29.935 [2024-05-14 23:49:30.224971] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359579 ] 00:07:29.935 [2024-05-14 23:49:30.353894] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.935 [2024-05-14 23:49:30.455020] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.935 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 
23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:29.936 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:30.195 23:49:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 
-- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:31.132 23:49:31 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.132 00:07:31.132 real 0m1.523s 00:07:31.132 user 0m1.326s 00:07:31.132 sys 0m0.195s 00:07:31.132 23:49:31 accel.accel_copy -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:31.132 23:49:31 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:31.132 ************************************ 00:07:31.132 END TEST accel_copy 00:07:31.132 ************************************ 00:07:31.391 23:49:31 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:31.391 23:49:31 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:31.391 23:49:31 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:31.391 23:49:31 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.391 ************************************ 00:07:31.391 START TEST accel_fill 00:07:31.391 ************************************ 00:07:31.391 23:49:31 accel.accel_fill -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:31.391 23:49:31 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 
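[annotation] The command traced just above shows how each of these tests drives the accel_perf example binary: the harness feeds a JSON accel configuration over an anonymous descriptor (/dev/fd/62, produced by build_accel_config) and selects the workload with -w plus workload-specific flags. The lines below are only a sketch of rerunning the fill case by hand outside the harness; the flags are copied verbatim from the traced command line, and dropping -c is an assumption that accel_perf then falls back to the plain software module, which is all this software-only run exercises anyway.

    # Hand-run sketch of the traced accel_fill invocation; not part of the harness.
    # SPDK_BUILD is taken straight from the path visible in the trace above.
    SPDK_BUILD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build
    "$SPDK_BUILD/examples/accel_perf" -t 1 -w fill -f 128 -q 64 -a 64 -y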
00:07:31.391 [2024-05-14 23:49:31.824524] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:31.391 [2024-05-14 23:49:31.824580] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359773 ] 00:07:31.391 [2024-05-14 23:49:31.954346] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.651 [2024-05-14 23:49:32.056072] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:31.651 23:49:32 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:31.651 23:49:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 
23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:33.031 23:49:33 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:33.031 00:07:33.031 real 0m1.533s 00:07:33.031 user 0m0.013s 00:07:33.031 sys 0m0.001s 00:07:33.031 23:49:33 accel.accel_fill -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:33.031 23:49:33 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:33.031 ************************************ 00:07:33.031 END TEST accel_fill 00:07:33.031 ************************************ 00:07:33.031 23:49:33 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:33.031 23:49:33 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:33.031 23:49:33 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:33.031 23:49:33 accel -- common/autotest_common.sh@10 -- # set +x 00:07:33.031 ************************************ 00:07:33.031 START TEST accel_copy_crc32c 00:07:33.031 ************************************ 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:33.031 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:33.031 [2024-05-14 23:49:33.433219] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
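[annotation] Most of the repeated xtrace above (IFS=:, read -r var val, case "$var" in, the val=... assignments) is a single loop in accel.sh that reads accel_perf's output line by line, splits each line on ':', and records the fields that the closing [[ -n software ]] / [[ -n fill ]] / [[ software == software ]] assertions then check. Below is a simplified stand-in for that pattern, not the real accel.sh code; the key names matched in the case statement are hypothetical, since the exact strings accel_perf prints are not visible in this excerpt.

    # Illustrative only: mimics the accel.sh@19-27 pattern seen in the trace.
    SPDK_BUILD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build
    accel_module="" accel_opc=""
    while IFS=: read -r var val; do
        case "$var" in
            *[Mm]odule*)   accel_module=${val# } ;;  # hypothetical key; "software" in this run
            *[Ww]orkload*) accel_opc=${val# } ;;     # hypothetical key; "fill", "copy", "xor", ...
        esac
    done < <("$SPDK_BUILD/examples/accel_perf" -t 1 -w fill -f 128 -q 64 -a 64 -y)

    [[ -n $accel_module ]]              # a module was reported
    [[ -n $accel_opc ]]                 # a workload was reported
    [[ $accel_module == "software" ]]   # the check logged at accel.sh@27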
00:07:33.031 [2024-05-14 23:49:33.433276] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359973 ] 00:07:33.031 [2024-05-14 23:49:33.561087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.291 [2024-05-14 23:49:33.662627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:33.291 23:49:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 
23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.669 00:07:34.669 real 0m1.532s 00:07:34.669 user 0m0.013s 00:07:34.669 sys 0m0.001s 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:34.669 23:49:34 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:34.669 ************************************ 00:07:34.669 END TEST accel_copy_crc32c 00:07:34.669 ************************************ 00:07:34.669 23:49:34 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:34.669 23:49:34 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:34.669 23:49:34 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:34.669 23:49:34 accel -- common/autotest_common.sh@10 -- # set +x 00:07:34.669 ************************************ 00:07:34.669 START TEST accel_copy_crc32c_C2 00:07:34.669 ************************************ 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy_crc32c -y -C 2 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:34.669 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:34.669 [2024-05-14 23:49:35.043936] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:34.669 [2024-05-14 23:49:35.043994] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360252 ] 00:07:34.669 [2024-05-14 23:49:35.172994] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.928 [2024-05-14 23:49:35.275003] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@20 -- # val=0 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:34.928 23:49:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.306 00:07:36.306 real 0m1.526s 00:07:36.306 user 0m0.008s 00:07:36.306 sys 0m0.001s 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:36.306 23:49:36 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:36.306 ************************************ 00:07:36.306 END TEST accel_copy_crc32c_C2 00:07:36.306 ************************************ 00:07:36.306 23:49:36 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 
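[annotation] Every block in this log is wrapped by the same run_test helper from autotest_common.sh: it prints the START TEST banner, times the test command (hence the per-test real/user/sys summary), and closes with the END TEST banner, with xtrace suppressed around the bookkeeping. The run_test accel_dualcast line just above kicks off the next such block. A simplified stand-in for that wrapper is sketched here; the banner text mirrors the log, everything else is an assumption about what the real helper does.

    # Simplified run_test-style wrapper; not the actual autotest_common.sh helper.
    run_test_sketch() {
        local name=$1 rc=0
        shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@" || rc=$?                 # e.g. accel_test -t 1 -w dualcast -y
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
    }

    run_test_sketch accel_dualcast accel_test -t 1 -w dualcast -y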
00:07:36.306 23:49:36 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:36.306 23:49:36 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:36.306 23:49:36 accel -- common/autotest_common.sh@10 -- # set +x 00:07:36.306 ************************************ 00:07:36.306 START TEST accel_dualcast 00:07:36.306 ************************************ 00:07:36.306 23:49:36 accel.accel_dualcast -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dualcast -y 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:36.306 23:49:36 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:36.307 23:49:36 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:36.307 [2024-05-14 23:49:36.647326] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:36.307 [2024-05-14 23:49:36.647382] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360529 ] 00:07:36.307 [2024-05-14 23:49:36.775079] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.307 [2024-05-14 23:49:36.872366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.565 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:36.566 23:49:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 
accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:37.942 23:49:38 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:37.942 00:07:37.942 real 0m1.501s 00:07:37.942 user 0m1.316s 00:07:37.942 sys 0m0.180s 00:07:37.942 23:49:38 accel.accel_dualcast -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:37.942 23:49:38 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:37.942 ************************************ 00:07:37.942 END TEST accel_dualcast 00:07:37.942 ************************************ 00:07:37.942 23:49:38 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:37.942 23:49:38 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:37.942 23:49:38 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:37.942 23:49:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.942 ************************************ 00:07:37.942 START TEST accel_compare 00:07:37.942 ************************************ 00:07:37.942 23:49:38 accel.accel_compare -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compare -y 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.942 23:49:38 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:37.943 [2024-05-14 23:49:38.212054] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
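[annotation] A small reading aid for the result checks that close each test (here accel_dualcast): the right-hand side printed as \s\o\f\t\w\a\r\e is simply how bash's xtrace renders a string that is compared literally inside [[ ]]; each character is escaped to show it is not being treated as a glob pattern. In the script this is nothing more exotic than:

    [[ -n $accel_module ]] && [[ -n $accel_opc ]]
    [[ $accel_module == "software" ]]   # xtrace prints the quoted pattern as \s\o\f\t\w\a\r\e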
00:07:37.943 [2024-05-14 23:49:38.212095] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360729 ] 00:07:37.943 [2024-05-14 23:49:38.321031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.943 [2024-05-14 23:49:38.421662] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:37.943 23:49:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:39.321 23:49:39 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.321 00:07:39.321 real 0m1.492s 00:07:39.321 user 0m0.013s 00:07:39.321 sys 0m0.001s 00:07:39.321 23:49:39 accel.accel_compare -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:39.321 23:49:39 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:39.321 ************************************ 00:07:39.321 END TEST accel_compare 00:07:39.321 ************************************ 00:07:39.321 23:49:39 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:39.321 23:49:39 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:39.321 23:49:39 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:39.321 23:49:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.321 ************************************ 00:07:39.321 START TEST accel_xor 00:07:39.321 ************************************ 00:07:39.321 23:49:39 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:39.321 23:49:39 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:39.321 [2024-05-14 23:49:39.794649] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:39.321 [2024-05-14 23:49:39.794705] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360923 ] 00:07:39.580 [2024-05-14 23:49:39.923702] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.580 [2024-05-14 23:49:40.025793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:39.580 23:49:40 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.580 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.581 23:49:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:40.959 00:07:40.959 real 0m1.522s 00:07:40.959 user 0m0.012s 00:07:40.959 sys 0m0.002s 00:07:40.959 23:49:41 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:40.959 23:49:41 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:40.959 ************************************ 00:07:40.959 END TEST accel_xor 00:07:40.959 ************************************ 00:07:40.959 23:49:41 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:40.959 23:49:41 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:40.959 23:49:41 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:40.959 23:49:41 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.959 ************************************ 00:07:40.959 START TEST accel_xor 00:07:40.959 ************************************ 00:07:40.959 23:49:41 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y -x 3 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:40.959 23:49:41 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:40.959 [2024-05-14 23:49:41.405334] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:40.959 [2024-05-14 23:49:41.405425] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361122 ] 00:07:40.959 [2024-05-14 23:49:41.539297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.217 [2024-05-14 23:49:41.644998] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:41.217 23:49:41 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.217 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:41.218 23:49:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:42.592 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:42.593 23:49:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.593 00:07:42.593 real 0m1.534s 00:07:42.593 user 0m1.338s 00:07:42.593 sys 0m0.192s 00:07:42.593 23:49:42 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:42.593 23:49:42 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:42.593 ************************************ 00:07:42.593 END TEST accel_xor 00:07:42.593 ************************************ 00:07:42.593 23:49:42 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:42.593 23:49:42 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:42.593 23:49:42 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:42.593 23:49:42 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.593 ************************************ 00:07:42.593 START TEST accel_dif_verify 00:07:42.593 ************************************ 00:07:42.593 23:49:42 accel.accel_dif_verify -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_verify 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:42.593 23:49:42 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:42.593 [2024-05-14 23:49:43.007255] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:42.593 [2024-05-14 23:49:43.007297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361327 ] 00:07:42.593 [2024-05-14 23:49:43.119941] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.852 [2024-05-14 23:49:43.224867] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:42.852 23:49:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" 
in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:44.229 23:49:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.229 00:07:44.229 real 0m1.493s 00:07:44.229 user 0m0.013s 00:07:44.229 sys 0m0.002s 00:07:44.229 23:49:44 accel.accel_dif_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:44.229 23:49:44 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:44.229 ************************************ 00:07:44.229 END TEST accel_dif_verify 00:07:44.229 ************************************ 00:07:44.229 23:49:44 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:44.229 23:49:44 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:44.229 23:49:44 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:44.229 23:49:44 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.229 ************************************ 00:07:44.229 START TEST accel_dif_generate 00:07:44.229 ************************************ 00:07:44.229 23:49:44 accel.accel_dif_generate -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 
-w dif_generate 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:44.229 23:49:44 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:44.229 [2024-05-14 23:49:44.580705] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:44.229 [2024-05-14 23:49:44.580762] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361620 ] 00:07:44.229 [2024-05-14 23:49:44.707352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.229 [2024-05-14 23:49:44.804098] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.524 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.524 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 
accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # 
read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:44.525 23:49:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:45.484 23:49:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.484 00:07:45.484 real 0m1.502s 00:07:45.484 user 0m1.319s 00:07:45.484 sys 0m0.181s 00:07:45.484 23:49:46 accel.accel_dif_generate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:45.484 23:49:46 
accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:45.484 ************************************ 00:07:45.484 END TEST accel_dif_generate 00:07:45.484 ************************************ 00:07:45.743 23:49:46 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:45.743 23:49:46 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:45.743 23:49:46 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:45.743 23:49:46 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.743 ************************************ 00:07:45.743 START TEST accel_dif_generate_copy 00:07:45.743 ************************************ 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate_copy 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:45.743 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:45.743 [2024-05-14 23:49:46.130998] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:45.743 [2024-05-14 23:49:46.131036] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361875 ] 00:07:45.743 [2024-05-14 23:49:46.240483] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.002 [2024-05-14 23:49:46.341364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.002 23:49:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:47.387 23:49:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.387 00:07:47.388 real 0m1.481s 00:07:47.388 user 0m0.014s 00:07:47.388 sys 0m0.000s 00:07:47.388 23:49:47 accel.accel_dif_generate_copy -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:47.388 23:49:47 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:47.388 ************************************ 00:07:47.388 END TEST accel_dif_generate_copy 00:07:47.388 ************************************ 00:07:47.388 23:49:47 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:47.388 23:49:47 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.388 23:49:47 accel -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:47.388 23:49:47 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:47.388 23:49:47 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.388 ************************************ 00:07:47.388 START TEST accel_comp 00:07:47.388 ************************************ 00:07:47.388 23:49:47 accel.accel_comp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:47.388 23:49:47 accel.accel_comp -- 
accel/accel.sh@19 -- # IFS=: 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:47.388 23:49:47 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:07:47.388 [2024-05-14 23:49:47.712866] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:47.388 [2024-05-14 23:49:47.712925] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362073 ] 00:07:47.388 [2024-05-14 23:49:47.841630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.388 [2024-05-14 23:49:47.942130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var 
val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.646 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.646 23:49:48 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:47.647 23:49:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:47.647 23:49:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.647 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:47.647 23:49:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:49.023 23:49:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.023 00:07:49.023 real 0m1.529s 00:07:49.023 user 0m1.325s 00:07:49.023 sys 0m0.201s 00:07:49.023 23:49:49 accel.accel_comp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:49.023 23:49:49 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:49.023 ************************************ 00:07:49.023 END TEST accel_comp 00:07:49.023 ************************************ 00:07:49.023 23:49:49 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:49.023 23:49:49 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:49.023 23:49:49 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:49.023 23:49:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.023 ************************************ 00:07:49.023 START TEST accel_decomp 00:07:49.023 ************************************ 00:07:49.023 23:49:49 accel.accel_decomp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:49.023 23:49:49 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:49.023 [2024-05-14 23:49:49.326729] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:49.023 [2024-05-14 23:49:49.326787] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362272 ] 00:07:49.023 [2024-05-14 23:49:49.455802] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.023 [2024-05-14 23:49:49.556786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:49.282 23:49:49 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:49.282 23:49:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:50.659 23:49:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.659 00:07:50.659 real 0m1.528s 00:07:50.659 user 0m0.015s 00:07:50.659 sys 0m0.000s 00:07:50.660 23:49:50 accel.accel_decomp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:50.660 23:49:50 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:50.660 ************************************ 00:07:50.660 END TEST accel_decomp 00:07:50.660 ************************************ 00:07:50.660 23:49:50 accel -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:50.660 23:49:50 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:50.660 23:49:50 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:50.660 23:49:50 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.660 ************************************ 00:07:50.660 START TEST accel_decmop_full 00:07:50.660 ************************************ 00:07:50.660 23:49:50 accel.accel_decmop_full -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@16 -- # local accel_opc 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@17 -- # local accel_module 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@12 -- # build_accel_config 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@40 -- # local IFS=, 00:07:50.660 23:49:50 accel.accel_decmop_full -- accel/accel.sh@41 -- # jq -r . 00:07:50.660 [2024-05-14 23:49:50.906892] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
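For anyone reading the xtrace rather than just the START/END TEST markers: every accel_* block in this section is accel.sh handing one workload to build/examples/accel_perf, with a JSON accel config piped in on /dev/fd/62 (assembled by build_accel_config and jq -r ., as the trace shows), and the long runs of val=.../case "$var" in/read -r var val lines are accel.sh echoing each option through its parser before the run. A minimal hand-run equivalent of the accel_decmop_full case being set up here is sketched below; it assumes the SPDK tree at this workspace path is already built and that dropping the -c /dev/fd/62 config simply leaves accel_perf on its default software module (the trace itself reports accel_module=software). The flag notes are readings of the surrounding trace, not accel_perf's own help text.

# Flags copied verbatim from the accel_perf command line in the trace above, minus -c /dev/fd/62:
#   -t 1            one-second run, consistent with the 'real 0m1.5s' figures at each END TEST
#   -w decompress   workload name, echoed as accel_opc=decompress in the trace
#   -l .../bib      the test input file used by all accel_* cases in this section
#   -y -o 0         with -o 0 the trace shows a '111250 bytes' payload instead of the default '4096 bytes'
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
    -t 1 -w decompress \
    -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib \
    -y -o 0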
00:07:50.660 [2024-05-14 23:49:50.906934] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362466 ] 00:07:50.660 [2024-05-14 23:49:51.018782] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.660 [2024-05-14 23:49:51.125030] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=0x1 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=decompress 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 
-- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=software 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@22 -- # accel_module=software 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=32 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=32 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=1 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=Yes 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:50.660 23:49:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:50.661 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:50.661 23:49:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- 
accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:52.037 23:49:52 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.037 00:07:52.037 real 0m1.511s 00:07:52.037 user 0m0.014s 00:07:52.037 sys 0m0.001s 00:07:52.037 23:49:52 accel.accel_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:52.037 23:49:52 accel.accel_decmop_full -- common/autotest_common.sh@10 -- # set +x 00:07:52.037 ************************************ 00:07:52.037 END TEST accel_decmop_full 00:07:52.037 ************************************ 00:07:52.037 23:49:52 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:52.037 23:49:52 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:52.037 23:49:52 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:52.037 23:49:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.037 ************************************ 00:07:52.037 START TEST accel_decomp_mcore 00:07:52.037 ************************************ 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:52.037 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:52.037 [2024-05-14 23:49:52.526205] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:52.037 [2024-05-14 23:49:52.526269] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362736 ] 00:07:52.296 [2024-05-14 23:49:52.657152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:52.296 [2024-05-14 23:49:52.762876] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:52.296 [2024-05-14 23:49:52.762961] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.296 [2024-05-14 23:49:52.763067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:52.296 [2024-05-14 23:49:52.763068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.296 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 
23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.297 23:49:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.672 00:07:53.672 real 0m1.527s 00:07:53.672 user 0m4.740s 00:07:53.672 sys 0m0.198s 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:53.672 23:49:54 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:53.672 ************************************ 00:07:53.672 END TEST accel_decomp_mcore 00:07:53.672 ************************************ 00:07:53.672 23:49:54 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:53.672 23:49:54 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:53.672 23:49:54 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:53.672 23:49:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:53.672 ************************************ 00:07:53.672 START TEST accel_decomp_full_mcore 00:07:53.672 ************************************ 00:07:53.672 23:49:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:53.672 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:53.672 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:53.672 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.672 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:53.673 23:49:54 
accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:53.673 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:53.673 [2024-05-14 23:49:54.137858] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:07:53.673 [2024-05-14 23:49:54.137916] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363024 ] 00:07:53.932 [2024-05-14 23:49:54.267408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:53.932 [2024-05-14 23:49:54.372307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.932 [2024-05-14 23:49:54.372394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:53.932 [2024-05-14 23:49:54.372500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:53.932 [2024-05-14 23:49:54.372502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.932 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:53.933 23:49:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.310 00:07:55.310 real 0m1.558s 00:07:55.310 user 0m4.839s 00:07:55.310 sys 0m0.211s 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:55.310 23:49:55 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:55.310 ************************************ 00:07:55.310 END TEST accel_decomp_full_mcore 00:07:55.310 ************************************ 00:07:55.310 23:49:55 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:55.310 23:49:55 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:55.310 23:49:55 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:55.310 23:49:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.310 ************************************ 00:07:55.310 START TEST accel_decomp_mthread 00:07:55.310 ************************************ 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:55.310 23:49:55 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:55.310 [2024-05-14 23:49:55.780389] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:55.310 [2024-05-14 23:49:55.780481] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363226 ] 00:07:55.570 [2024-05-14 23:49:55.914124] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.570 [2024-05-14 23:49:56.017692] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.570 23:49:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.946 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.947 00:07:56.947 real 0m1.549s 00:07:56.947 user 0m1.352s 00:07:56.947 sys 0m0.200s 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:56.947 23:49:57 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:56.947 ************************************ 00:07:56.947 END TEST accel_decomp_mthread 00:07:56.947 ************************************ 00:07:56.947 23:49:57 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:56.947 23:49:57 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:56.947 23:49:57 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:56.947 23:49:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.947 ************************************ 00:07:56.947 START TEST accel_decomp_full_mthread 00:07:56.947 
************************************ 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:56.947 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:56.947 [2024-05-14 23:49:57.417222] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:56.947 [2024-05-14 23:49:57.417282] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363420 ] 00:07:57.207 [2024-05-14 23:49:57.546916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.207 [2024-05-14 23:49:57.647603] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:57.207 23:49:57 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:57.207 23:49:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:58.584 00:07:58.584 real 0m1.561s 00:07:58.584 user 0m1.377s 00:07:58.584 sys 0m0.189s 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:58.584 23:49:58 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:58.584 ************************************ 00:07:58.584 END TEST accel_decomp_full_mthread 00:07:58.584 ************************************ 
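Each accel_decomp_* case above is a single run of the SPDK accel_perf example, and the full command line is recorded in the trace itself. A minimal sketch for re-running the threaded full-buffer decompress case by hand, copied from the command shown in the log and assuming the same workspace layout, is:

    # 1-second decompress workload ('1 seconds' in the trace), -y to verify;
    # -o 0 selects the full input, which the trace reports as '111250 bytes',
    # and -c /dev/fd/62 is the accel JSON config fed in via process substitution.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -c /dev/fd/62 -t 1 -w decompress \
        -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib \
        -y -o 0 -T 2

In the accel_cdev_* cases that follow, the harness additionally passes the accel config entry visible in the trace, {"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}, to spdk_tgt and accel_perf; that entry is what moves the compress/decompress opcodes from the software module to dpdk_compressdev (QAT), as the accel_get_opc_assignments output below confirms.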
00:07:58.584 23:49:58 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:58.584 23:49:58 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:58.584 23:49:58 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:58.584 23:49:58 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:58.584 23:49:58 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=363616 00:07:58.584 23:49:58 accel -- accel/accel.sh@63 -- # waitforlisten 363616 00:07:58.584 23:49:58 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@827 -- # '[' -z 363616 ']' 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:58.584 23:49:58 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:58.584 23:49:58 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:58.584 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:58.584 23:49:58 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.584 23:49:58 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.584 23:49:58 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.584 23:49:58 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.584 23:49:58 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:58.584 23:49:58 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:58.584 23:49:58 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:58.584 23:49:58 accel -- accel/accel.sh@41 -- # jq -r . 00:07:58.584 [2024-05-14 23:49:59.048501] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:07:58.584 [2024-05-14 23:49:59.048566] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363616 ] 00:07:58.843 [2024-05-14 23:49:59.179138] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.843 [2024-05-14 23:49:59.277107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.779 [2024-05-14 23:50:00.052960] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:59.779 23:50:00 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:59.779 23:50:00 accel -- common/autotest_common.sh@860 -- # return 0 00:07:59.779 23:50:00 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:59.779 23:50:00 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:59.779 23:50:00 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:59.779 23:50:00 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:59.779 23:50:00 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:59.780 23:50:00 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:59.780 23:50:00 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:59.780 23:50:00 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:59.780 23:50:00 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.780 23:50:00 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:00.039 "method": "compressdev_scan_accel_module", 00:08:00.039 23:50:00 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:00.039 23:50:00 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:00.039 23:50:00 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 
00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.039 23:50:00 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.039 23:50:00 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.039 23:50:00 accel -- accel/accel.sh@75 -- # killprocess 363616 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@946 -- # '[' -z 363616 ']' 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@950 -- # kill -0 363616 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@951 -- # uname 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 363616 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 363616' 00:08:00.039 killing process with pid 363616 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@965 -- # kill 363616 00:08:00.039 23:50:00 accel -- common/autotest_common.sh@970 -- # wait 363616 00:08:00.608 23:50:00 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:00.608 23:50:00 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.608 23:50:00 accel -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:08:00.608 23:50:00 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:00.608 23:50:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.608 ************************************ 00:08:00.608 START TEST accel_cdev_comp 00:08:00.608 ************************************ 00:08:00.608 23:50:01 accel.accel_cdev_comp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.608 23:50:01 
accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:00.608 23:50:01 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:00.608 [2024-05-14 23:50:01.043646] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:08:00.608 [2024-05-14 23:50:01.043711] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363981 ] 00:08:00.608 [2024-05-14 23:50:01.176039] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.866 [2024-05-14 23:50:01.282132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.501 [2024-05-14 23:50:02.044893] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:01.501 [2024-05-14 23:50:02.047470] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26f1820 PMD being used: compress_qat 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 [2024-05-14 23:50:02.051515] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28f6570 PMD being used: compress_qat 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.501 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case 
"$var" in 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.502 23:50:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:02.884 23:50:03 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:02.884 00:08:02.884 real 0m2.252s 00:08:02.884 user 0m1.680s 00:08:02.884 sys 0m0.575s 00:08:02.884 23:50:03 accel.accel_cdev_comp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:02.884 23:50:03 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:02.884 ************************************ 00:08:02.884 END TEST accel_cdev_comp 00:08:02.884 ************************************ 00:08:02.884 23:50:03 accel -- accel/accel.sh@128 -- # run_test 
accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.884 23:50:03 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:08:02.884 23:50:03 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:02.884 23:50:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.884 ************************************ 00:08:02.884 START TEST accel_cdev_decomp 00:08:02.884 ************************************ 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.884 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:02.885 23:50:03 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:02.885 [2024-05-14 23:50:03.376413] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:02.885 [2024-05-14 23:50:03.376471] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid364304 ] 00:08:03.143 [2024-05-14 23:50:03.488307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.143 [2024-05-14 23:50:03.588804] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.076 [2024-05-14 23:50:04.358103] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:04.076 [2024-05-14 23:50:04.360737] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf57820 PMD being used: compress_qat 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 [2024-05-14 23:50:04.365018] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x115c570 PMD being used: compress_qat 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.076 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 
-- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.077 23:50:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.077 23:50:04 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:05.010 00:08:05.010 real 0m2.233s 00:08:05.010 user 0m0.022s 00:08:05.010 sys 0m0.005s 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:05.010 23:50:05 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:05.010 ************************************ 00:08:05.010 END TEST accel_cdev_decomp 00:08:05.010 ************************************ 00:08:05.269 23:50:05 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.269 23:50:05 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:08:05.269 23:50:05 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:05.269 23:50:05 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.269 ************************************ 00:08:05.269 START TEST accel_cdev_decmop_full 00:08:05.269 ************************************ 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 
-o 0 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@16 -- # local accel_opc 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@17 -- # local accel_module 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # build_accel_config 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@40 -- # local IFS=, 00:08:05.269 23:50:05 accel.accel_cdev_decmop_full -- accel/accel.sh@41 -- # jq -r . 00:08:05.269 [2024-05-14 23:50:05.698205] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
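For reference, the accel_perf invocation echoed just above can be reproduced by hand outside the test harness. A minimal sketch using only the flags recorded in this log; accel.json is a hypothetical placeholder for the JSON config that accel.sh streams to the binary over /dev/fd/62:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # Same flags as the accel_cdev_decmop_full run above: a 1-second decompress
    # of test/accel/bib with -o 0, the variant for which the log below records a
    # full '111250 bytes' payload instead of the '4096 bytes' of the plain test.
    "$SPDK/build/examples/accel_perf" -c accel.json \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0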
00:08:05.269 [2024-05-14 23:50:05.698266] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid364556 ] 00:08:05.269 [2024-05-14 23:50:05.827775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.528 [2024-05-14 23:50:05.927460] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.464 [2024-05-14 23:50:06.703071] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:06.464 [2024-05-14 23:50:06.705484] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xef1820 PMD being used: compress_qat 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.464 [2024-05-14 23:50:06.708656] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xef4b00 PMD being used: compress_qat 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.464 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=0x1 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=decompress 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='111250 bytes' 
00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=1 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=Yes 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 
accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.465 23:50:06 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:07.414 00:08:07.414 real 0m2.254s 00:08:07.414 user 0m1.678s 00:08:07.414 sys 0m0.571s 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:07.414 23:50:07 accel.accel_cdev_decmop_full -- common/autotest_common.sh@10 -- # set +x 00:08:07.414 ************************************ 00:08:07.414 END TEST accel_cdev_decmop_full 00:08:07.414 ************************************ 00:08:07.414 23:50:07 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.414 23:50:07 accel -- common/autotest_common.sh@1097 -- 
# '[' 11 -le 1 ']' 00:08:07.414 23:50:07 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:07.414 23:50:07 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.672 ************************************ 00:08:07.672 START TEST accel_cdev_decomp_mcore 00:08:07.672 ************************************ 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:07.672 23:50:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:07.672 [2024-05-14 23:50:08.037308] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
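The mcore variant launched above runs the same accel_perf command with -m 0xf added; the four "Reactor started on core ..." notices recorded below correspond to that mask. Sketch with the flags exactly as they appear in the log (accel.json again a hypothetical placeholder for the streamed config):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # accel_cdev_decomp_mcore: same 1-second decompress workload, 4-core mask.
    "$SPDK/build/examples/accel_perf" -c accel.json \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -m 0xf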
00:08:07.672 [2024-05-14 23:50:08.037366] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid364920 ] 00:08:07.672 [2024-05-14 23:50:08.166495] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:07.930 [2024-05-14 23:50:08.271169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:07.930 [2024-05-14 23:50:08.271254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:07.930 [2024-05-14 23:50:08.271365] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:07.930 [2024-05-14 23:50:08.271366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.498 [2024-05-14 23:50:09.026294] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:08.498 [2024-05-14 23:50:09.028894] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19bde80 PMD being used: compress_qat 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 [2024-05-14 23:50:09.034516] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7effec19b890 PMD being used: compress_qat 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 [2024-05-14 23:50:09.035274] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7effe419b890 PMD being used: compress_qat 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 [2024-05-14 23:50:09.036334] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19c3390 PMD being used: compress_qat 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 [2024-05-14 23:50:09.036548] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7effdc19b890 PMD being used: compress_qat 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore 
-- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.498 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.499 23:50:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:09.874 00:08:09.874 real 0m2.246s 00:08:09.874 user 0m7.222s 00:08:09.874 sys 0m0.583s 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:09.874 23:50:10 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:09.874 ************************************ 00:08:09.874 END TEST accel_cdev_decomp_mcore 00:08:09.874 ************************************ 00:08:09.874 23:50:10 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:09.874 23:50:10 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:08:09.874 23:50:10 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:09.874 23:50:10 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.874 ************************************ 00:08:09.874 START TEST accel_cdev_decomp_full_mcore 00:08:09.874 ************************************ 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.875 23:50:10 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:09.875 23:50:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:09.875 [2024-05-14 23:50:10.374620] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:08:09.875 [2024-05-14 23:50:10.374687] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid365288 ] 00:08:10.132 [2024-05-14 23:50:10.508419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:10.132 [2024-05-14 23:50:10.618064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.132 [2024-05-14 23:50:10.618149] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:10.132 [2024-05-14 23:50:10.618256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:10.132 [2024-05-14 23:50:10.618257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.068 [2024-05-14 23:50:11.373977] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:11.068 [2024-05-14 23:50:11.376530] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14cbe80 PMD being used: compress_qat 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 [2024-05-14 23:50:11.381155] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc3ac19b890 PMD being used: compress_qat 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 [2024-05-14 23:50:11.381895] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 
0x7fc3a419b890 PMD being used: compress_qat 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 [2024-05-14 23:50:11.382990] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14cbf20 PMD being used: compress_qat 00:08:11.068 [2024-05-14 23:50:11.383138] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc39c19b890 PMD being used: compress_qat 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.068 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:11.069 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:11.069 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:11.069 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:11.069 23:50:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # 
read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:12.447 00:08:12.447 real 0m2.269s 00:08:12.447 user 0m7.250s 00:08:12.447 sys 0m0.603s 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:12.447 23:50:12 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:12.447 ************************************ 00:08:12.447 END TEST accel_cdev_decomp_full_mcore 00:08:12.447 ************************************ 00:08:12.447 23:50:12 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:12.447 23:50:12 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:08:12.447 23:50:12 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:12.447 23:50:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.447 ************************************ 
00:08:12.447 START TEST accel_cdev_decomp_mthread 00:08:12.447 ************************************ 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:12.447 23:50:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:12.447 [2024-05-14 23:50:12.728074] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:12.447 [2024-05-14 23:50:12.728131] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid365571 ] 00:08:12.447 [2024-05-14 23:50:12.857254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.447 [2024-05-14 23:50:12.957867] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.385 [2024-05-14 23:50:13.739309] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:13.385 [2024-05-14 23:50:13.741921] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17af820 PMD being used: compress_qat 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 [2024-05-14 23:50:13.746893] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17b4870 PMD being used: compress_qat 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 [2024-05-14 23:50:13.749352] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18d7120 PMD being used: compress_qat 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread 
-- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.385 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.386 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:13.386 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.386 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.386 23:50:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:14.762 
23:50:14 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:14.762 00:08:14.762 real 0m2.267s 00:08:14.762 user 0m1.707s 00:08:14.762 sys 0m0.556s 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:14.762 23:50:14 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:14.762 ************************************ 00:08:14.762 END TEST accel_cdev_decomp_mthread 00:08:14.762 ************************************ 00:08:14.762 23:50:14 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:14.762 23:50:14 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:08:14.762 23:50:14 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:14.762 23:50:15 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.762 ************************************ 00:08:14.762 START TEST accel_cdev_decomp_full_mthread 00:08:14.762 ************************************ 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:14.762 23:50:15 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:14.762 [2024-05-14 23:50:15.062750] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:14.763 [2024-05-14 23:50:15.062791] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid365861 ] 00:08:14.763 [2024-05-14 23:50:15.172946] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.763 [2024-05-14 23:50:15.270929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.699 [2024-05-14 23:50:16.040689] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:15.699 [2024-05-14 23:50:16.043303] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1856820 PMD being used: compress_qat 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 [2024-05-14 23:50:16.047487] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1859b00 PMD being used: compress_qat 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 [2024-05-14 23:50:16.050281] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a5b320 PMD being used: compress_qat 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 
00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:15.699 23:50:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:17.076 23:50:17 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:17.076 00:08:17.076 real 0m2.221s 00:08:17.076 user 0m1.665s 00:08:17.076 sys 0m0.556s 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:17.076 23:50:17 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:17.076 ************************************ 00:08:17.076 END TEST accel_cdev_decomp_full_mthread 00:08:17.076 ************************************ 00:08:17.076 23:50:17 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:17.076 23:50:17 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:17.076 23:50:17 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:17.076 23:50:17 accel -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:08:17.076 23:50:17 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.076 23:50:17 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:17.076 23:50:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.076 23:50:17 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.076 23:50:17 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.076 23:50:17 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.076 23:50:17 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.076 23:50:17 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:17.076 23:50:17 accel -- accel/accel.sh@41 -- # jq -r . 00:08:17.076 ************************************ 00:08:17.076 START TEST accel_dif_functional_tests 00:08:17.076 ************************************ 00:08:17.076 23:50:17 accel.accel_dif_functional_tests -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:17.076 [2024-05-14 23:50:17.410336] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:17.076 [2024-05-14 23:50:17.410393] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid366226 ] 00:08:17.076 [2024-05-14 23:50:17.540684] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:17.076 [2024-05-14 23:50:17.644009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:17.076 [2024-05-14 23:50:17.644095] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:17.076 [2024-05-14 23:50:17.644102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.335 00:08:17.335 00:08:17.335 CUnit - A unit testing framework for C - Version 2.1-3 00:08:17.335 http://cunit.sourceforge.net/ 00:08:17.335 00:08:17.335 00:08:17.335 Suite: accel_dif 00:08:17.335 Test: verify: DIF generated, GUARD check ...passed 00:08:17.335 Test: verify: DIF generated, APPTAG check ...passed 00:08:17.335 Test: verify: DIF generated, REFTAG check ...passed 00:08:17.335 Test: verify: DIF not generated, GUARD check ...[2024-05-14 23:50:17.744167] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:17.335 [2024-05-14 23:50:17.744223] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:17.335 passed 00:08:17.335 Test: verify: DIF not generated, APPTAG check ...[2024-05-14 23:50:17.744262] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:17.335 [2024-05-14 23:50:17.744285] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:17.335 passed 00:08:17.335 Test: verify: DIF not generated, REFTAG check ...[2024-05-14 23:50:17.744312] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:17.335 [2024-05-14 23:50:17.744341] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:17.335 passed 00:08:17.335 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:17.335 Test: verify: APPTAG incorrect, APPTAG check ...[2024-05-14 23:50:17.744407] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:17.335 passed 00:08:17.335 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:17.335 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:17.335 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:17.335 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-05-14 23:50:17.744564] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:17.335 passed 00:08:17.335 Test: generate copy: DIF generated, GUARD check ...passed 00:08:17.335 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:17.335 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:17.335 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:17.335 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:17.335 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:17.335 Test: generate copy: iovecs-len validate ...[2024-05-14 23:50:17.744807] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:17.335 passed 00:08:17.335 Test: generate copy: buffer alignment validate ...passed 00:08:17.335 00:08:17.335 Run Summary: Type Total Ran Passed Failed Inactive 00:08:17.335 suites 1 1 n/a 0 0 00:08:17.335 tests 20 20 20 0 0 00:08:17.335 asserts 204 204 204 0 n/a 00:08:17.335 00:08:17.335 Elapsed time = 0.002 seconds 00:08:17.594 00:08:17.594 real 0m0.635s 00:08:17.594 user 0m0.820s 00:08:17.594 sys 0m0.223s 00:08:17.594 23:50:17 accel.accel_dif_functional_tests -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:17.594 23:50:17 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:17.594 ************************************ 00:08:17.594 END TEST accel_dif_functional_tests 00:08:17.594 ************************************ 00:08:17.594 00:08:17.594 real 0m54.309s 00:08:17.594 user 1m2.480s 00:08:17.594 sys 0m11.665s 00:08:17.594 23:50:18 accel -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:17.594 23:50:18 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.594 ************************************ 00:08:17.594 END TEST accel 00:08:17.594 ************************************ 00:08:17.594 23:50:18 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:17.594 23:50:18 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:17.594 23:50:18 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:17.594 23:50:18 -- common/autotest_common.sh@10 -- # set +x 00:08:17.594 ************************************ 00:08:17.594 START TEST accel_rpc 00:08:17.594 ************************************ 00:08:17.594 23:50:18 accel_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:17.871 * Looking for test storage... 00:08:17.871 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:17.871 23:50:18 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:17.871 23:50:18 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=366435 00:08:17.871 23:50:18 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 366435 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@827 -- # '[' -z 366435 ']' 00:08:17.871 23:50:18 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:17.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:17.871 23:50:18 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:17.871 [2024-05-14 23:50:18.296544] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:17.871 [2024-05-14 23:50:18.296612] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid366435 ] 00:08:17.871 [2024-05-14 23:50:18.424261] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.138 [2024-05-14 23:50:18.533039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.705 23:50:19 accel_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:18.705 23:50:19 accel_rpc -- common/autotest_common.sh@860 -- # return 0 00:08:18.705 23:50:19 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:18.705 23:50:19 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:18.705 23:50:19 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:18.705 23:50:19 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:18.705 23:50:19 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:18.705 23:50:19 accel_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:18.705 23:50:19 accel_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:18.705 23:50:19 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:18.705 ************************************ 00:08:18.705 START TEST accel_assign_opcode 00:08:18.705 ************************************ 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1121 -- # accel_assign_opcode_test_suite 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:18.705 [2024-05-14 23:50:19.259341] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:18.705 [2024-05-14 23:50:19.267349] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:18.705 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:18.963 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:18.963 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # 
set +x 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:18.964 software 00:08:18.964 00:08:18.964 real 0m0.268s 00:08:18.964 user 0m0.049s 00:08:18.964 sys 0m0.014s 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:18.964 23:50:19 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:18.964 ************************************ 00:08:18.964 END TEST accel_assign_opcode 00:08:18.964 ************************************ 00:08:19.223 23:50:19 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 366435 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@946 -- # '[' -z 366435 ']' 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@950 -- # kill -0 366435 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@951 -- # uname 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 366435 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 366435' 00:08:19.223 killing process with pid 366435 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@965 -- # kill 366435 00:08:19.223 23:50:19 accel_rpc -- common/autotest_common.sh@970 -- # wait 366435 00:08:19.483 00:08:19.483 real 0m1.908s 00:08:19.483 user 0m1.975s 00:08:19.483 sys 0m0.585s 00:08:19.483 23:50:20 accel_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:19.483 23:50:20 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:19.483 ************************************ 00:08:19.483 END TEST accel_rpc 00:08:19.483 ************************************ 00:08:19.483 23:50:20 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:19.483 23:50:20 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:19.483 23:50:20 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:19.483 23:50:20 -- common/autotest_common.sh@10 -- # set +x 00:08:19.742 ************************************ 00:08:19.742 START TEST app_cmdline 00:08:19.742 ************************************ 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:19.742 * Looking for test storage... 
00:08:19.742 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:19.742 23:50:20 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:19.742 23:50:20 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=366719 00:08:19.742 23:50:20 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 366719 00:08:19.742 23:50:20 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@827 -- # '[' -z 366719 ']' 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:19.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:19.742 23:50:20 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:19.742 [2024-05-14 23:50:20.297793] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:08:19.742 [2024-05-14 23:50:20.297862] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid366719 ] 00:08:20.001 [2024-05-14 23:50:20.426652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.001 [2024-05-14 23:50:20.533105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.936 23:50:21 app_cmdline -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:20.936 23:50:21 app_cmdline -- common/autotest_common.sh@860 -- # return 0 00:08:20.936 23:50:21 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:20.936 { 00:08:20.936 "version": "SPDK v24.05-pre git sha1 52939f252", 00:08:20.936 "fields": { 00:08:20.936 "major": 24, 00:08:20.936 "minor": 5, 00:08:20.936 "patch": 0, 00:08:20.936 "suffix": "-pre", 00:08:20.937 "commit": "52939f252" 00:08:20.937 } 00:08:20.937 } 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:20.937 23:50:21 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:20.937 
23:50:21 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:20.937 23:50:21 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:21.195 request: 00:08:21.195 { 00:08:21.195 "method": "env_dpdk_get_mem_stats", 00:08:21.195 "req_id": 1 00:08:21.195 } 00:08:21.195 Got JSON-RPC error response 00:08:21.195 response: 00:08:21.195 { 00:08:21.195 "code": -32601, 00:08:21.195 "message": "Method not found" 00:08:21.195 } 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:21.195 23:50:21 app_cmdline -- app/cmdline.sh@1 -- # killprocess 366719 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@946 -- # '[' -z 366719 ']' 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@950 -- # kill -0 366719 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@951 -- # uname 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 366719 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@964 -- # echo 'killing process with pid 366719' 00:08:21.195 killing process with pid 366719 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@965 -- # kill 366719 00:08:21.195 23:50:21 app_cmdline -- common/autotest_common.sh@970 -- # wait 366719 00:08:21.761 00:08:21.761 real 0m2.052s 00:08:21.761 user 0m2.478s 00:08:21.761 sys 0m0.591s 00:08:21.761 23:50:22 app_cmdline -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:21.761 23:50:22 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:21.761 ************************************ 
00:08:21.761 END TEST app_cmdline 00:08:21.761 ************************************ 00:08:21.761 23:50:22 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:21.761 23:50:22 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:21.761 23:50:22 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:21.761 23:50:22 -- common/autotest_common.sh@10 -- # set +x 00:08:21.761 ************************************ 00:08:21.761 START TEST version 00:08:21.761 ************************************ 00:08:21.761 23:50:22 version -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:22.020 * Looking for test storage... 00:08:22.020 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:22.020 23:50:22 version -- app/version.sh@17 -- # get_header_version major 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # cut -f2 00:08:22.020 23:50:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # tr -d '"' 00:08:22.020 23:50:22 version -- app/version.sh@17 -- # major=24 00:08:22.020 23:50:22 version -- app/version.sh@18 -- # get_header_version minor 00:08:22.020 23:50:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # cut -f2 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # tr -d '"' 00:08:22.020 23:50:22 version -- app/version.sh@18 -- # minor=5 00:08:22.020 23:50:22 version -- app/version.sh@19 -- # get_header_version patch 00:08:22.020 23:50:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # cut -f2 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # tr -d '"' 00:08:22.020 23:50:22 version -- app/version.sh@19 -- # patch=0 00:08:22.020 23:50:22 version -- app/version.sh@20 -- # get_header_version suffix 00:08:22.020 23:50:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # cut -f2 00:08:22.020 23:50:22 version -- app/version.sh@14 -- # tr -d '"' 00:08:22.020 23:50:22 version -- app/version.sh@20 -- # suffix=-pre 00:08:22.020 23:50:22 version -- app/version.sh@22 -- # version=24.5 00:08:22.020 23:50:22 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:22.020 23:50:22 version -- app/version.sh@28 -- # version=24.5rc0 00:08:22.020 23:50:22 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:22.020 23:50:22 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:22.020 23:50:22 version -- app/version.sh@30 -- # py_version=24.5rc0 00:08:22.020 23:50:22 version -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:08:22.020 00:08:22.020 real 0m0.190s 
00:08:22.020 user 0m0.094s 00:08:22.020 sys 0m0.141s 00:08:22.020 23:50:22 version -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:22.020 23:50:22 version -- common/autotest_common.sh@10 -- # set +x 00:08:22.020 ************************************ 00:08:22.020 END TEST version 00:08:22.020 ************************************ 00:08:22.020 23:50:22 -- spdk/autotest.sh@184 -- # '[' 1 -eq 1 ']' 00:08:22.020 23:50:22 -- spdk/autotest.sh@185 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:22.020 23:50:22 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:22.020 23:50:22 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:22.020 23:50:22 -- common/autotest_common.sh@10 -- # set +x 00:08:22.020 ************************************ 00:08:22.020 START TEST blockdev_general 00:08:22.020 ************************************ 00:08:22.020 23:50:22 blockdev_general -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:22.279 * Looking for test storage... 00:08:22.279 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:22.279 23:50:22 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@47 -- # 
spdk_tgt_pid=367187 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:22.279 23:50:22 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 367187 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@827 -- # '[' -z 367187 ']' 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:22.279 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:22.279 23:50:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:22.279 [2024-05-14 23:50:22.732331] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:08:22.279 [2024-05-14 23:50:22.732407] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367187 ] 00:08:22.279 [2024-05-14 23:50:22.860118] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.538 [2024-05-14 23:50:22.964501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.103 23:50:23 blockdev_general -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:23.103 23:50:23 blockdev_general -- common/autotest_common.sh@860 -- # return 0 00:08:23.103 23:50:23 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:23.103 23:50:23 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:23.103 23:50:23 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:23.103 23:50:23 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.103 23:50:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.361 [2024-05-14 23:50:23.920346] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.361 [2024-05-14 23:50:23.920406] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.361 00:08:23.361 [2024-05-14 23:50:23.928332] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.361 [2024-05-14 23:50:23.928357] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.361 00:08:23.361 Malloc0 00:08:23.620 Malloc1 00:08:23.620 Malloc2 00:08:23.620 Malloc3 00:08:23.620 Malloc4 00:08:23.620 Malloc5 00:08:23.620 Malloc6 00:08:23.620 Malloc7 00:08:23.620 Malloc8 00:08:23.620 Malloc9 00:08:23.620 [2024-05-14 23:50:24.076938] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:23.620 [2024-05-14 23:50:24.076984] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:23.620 [2024-05-14 23:50:24.077004] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e9680 00:08:23.620 [2024-05-14 23:50:24.077016] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:08:23.621 [2024-05-14 23:50:24.078372] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:23.621 [2024-05-14 23:50:24.078406] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:23.621 TestPT 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:23.621 5000+0 records in 00:08:23.621 5000+0 records out 00:08:23.621 10240000 bytes (10 MB, 9.8 MiB) copied, 0.026118 s, 392 MB/s 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.621 AIO0 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.621 23:50:24 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.621 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.880 23:50:24 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:23.880 23:50:24 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:23.881 23:50:24 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"fa6ddae7-c906-49d6-9195-9609537935ef"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa6ddae7-c906-49d6-9195-9609537935ef",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "cfdabea7-e0b7-5e19-979d-b6e4703f4793"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "cfdabea7-e0b7-5e19-979d-b6e4703f4793",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "56b2027c-e9b7-5bd8-881f-9bf369ada9d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "56b2027c-e9b7-5bd8-881f-9bf369ada9d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "2dd09488-2b29-510a-8d6a-2d34a949e2fe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2dd09488-2b29-510a-8d6a-2d34a949e2fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "11788ef6-030c-5e5e-8352-b986b8528e86"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11788ef6-030c-5e5e-8352-b986b8528e86",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "09fd8caf-7ac9-58c2-a791-4bfc377303ec"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09fd8caf-7ac9-58c2-a791-4bfc377303ec",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "4de189f3-5463-5012-a4cb-ba3951d12e5f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4de189f3-5463-5012-a4cb-ba3951d12e5f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6e6a484f-6272-5d0d-a2b1-279240238d05"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"6e6a484f-6272-5d0d-a2b1-279240238d05",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1858bfa4-8ca2-5d90-843d-24db1013bd78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1858bfa4-8ca2-5d90-843d-24db1013bd78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "fa731af3-e4a5-5d17-9028-137ebf186bd0"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa731af3-e4a5-5d17-9028-137ebf186bd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "d1284f53-1877-4171-a41e-84dc59ebc642"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' 
"base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1376d111-c140-43cf-be04-fcaad502d39d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "43a7f8f4-1a90-40fe-93ea-69acd6f0811b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bfe7e916-fd61-420f-af6e-adbacd4e56fe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "18a32a47-4f4c-438d-90c8-a6e11ef81c25",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "57465bac-a2ed-4b9d-ad46-2ccf19a907b4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8dff0970-634c-4250-8812-173f3532f5e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "3488c802-d3a9-4229-a8fd-5248bf167ca7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:24.141 23:50:24 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:24.141 23:50:24 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:24.141 23:50:24 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:24.141 23:50:24 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 367187 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@946 -- # '[' -z 367187 ']' 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@950 -- # kill -0 367187 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@951 -- # uname 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 367187 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@964 -- # echo 'killing process with pid 367187' 00:08:24.141 killing process with pid 367187 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@965 -- # kill 367187 00:08:24.141 23:50:24 blockdev_general -- common/autotest_common.sh@970 -- # wait 367187 00:08:24.709 23:50:25 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:24.709 23:50:25 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:24.709 23:50:25 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:08:24.709 23:50:25 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:24.709 23:50:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.709 ************************************ 00:08:24.709 START TEST bdev_hello_world 00:08:24.709 ************************************ 00:08:24.709 23:50:25 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:24.709 [2024-05-14 23:50:25.146284] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:08:24.709 [2024-05-14 23:50:25.146352] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367566 ] 00:08:24.709 [2024-05-14 23:50:25.278706] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.969 [2024-05-14 23:50:25.385666] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.969 [2024-05-14 23:50:25.548069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:24.969 [2024-05-14 23:50:25.548134] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:24.969 [2024-05-14 23:50:25.548149] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:24.969 [2024-05-14 23:50:25.556086] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:24.969 [2024-05-14 23:50:25.556129] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:25.229 [2024-05-14 23:50:25.564091] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:25.229 [2024-05-14 23:50:25.564122] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:25.229 [2024-05-14 23:50:25.641297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:25.229 [2024-05-14 23:50:25.641351] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:25.229 [2024-05-14 23:50:25.641371] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1957e10 00:08:25.229 [2024-05-14 23:50:25.641383] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:25.229 [2024-05-14 23:50:25.642836] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:25.229 [2024-05-14 23:50:25.642866] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:25.229 [2024-05-14 23:50:25.783866] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:25.229 [2024-05-14 23:50:25.783935] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:25.229 [2024-05-14 23:50:25.783990] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:25.229 [2024-05-14 23:50:25.784064] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:25.229 [2024-05-14 23:50:25.784142] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:25.229 [2024-05-14 23:50:25.784172] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:25.229 [2024-05-14 23:50:25.784235] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
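The hello_world sub-test above runs SPDK's hello_bdev example against Malloc0: open the bdev, get an I/O channel, write a buffer, read it back, and expect "Hello World!". Outside the CI harness the same example can be run by hand; an illustrative sketch follows (the /tmp/hello_bdev.json path is arbitrary, the test itself uses test/bdev/bdev.json):
# minimal bdev subsystem config with one 32 MiB malloc bdev
cat > /tmp/hello_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF
# run the example against that bdev (built under build/examples in an SPDK tree)
./build/examples/hello_bdev --json /tmp/hello_bdev.json -b Malloc0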
00:08:25.229 00:08:25.229 [2024-05-14 23:50:25.784275] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:25.797 00:08:25.797 real 0m1.053s 00:08:25.797 user 0m0.703s 00:08:25.797 sys 0m0.319s 00:08:25.797 23:50:26 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:25.797 23:50:26 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:25.797 ************************************ 00:08:25.797 END TEST bdev_hello_world 00:08:25.797 ************************************ 00:08:25.797 23:50:26 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:25.797 23:50:26 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:25.797 23:50:26 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:25.797 23:50:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:25.797 ************************************ 00:08:25.797 START TEST bdev_bounds 00:08:25.797 ************************************ 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=367737 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 367737' 00:08:25.797 Process bdevio pid: 367737 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 367737 00:08:25.797 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 367737 ']' 00:08:25.798 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:25.798 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:25.798 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:25.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:25.798 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:25.798 23:50:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:25.798 [2024-05-14 23:50:26.284596] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
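bdev_bounds drives the bdevio app in wait mode and then triggers its CUnit suites over the RPC socket, which is why the invocation above passes -w: bdevio starts, registers the bdevs from bdev.json, and blocks until perform_tests is issued on /var/tmp/spdk.sock. Reduced to the two commands visible in the trace (shown here in isolation as a sketch):
# start bdevio in wait mode with the same bdev configuration
test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &
# once it is listening on /var/tmp/spdk.sock, kick off all suites
test/bdev/bdevio/tests.py perform_tests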
00:08:25.798 [2024-05-14 23:50:26.284645] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367737 ] 00:08:26.057 [2024-05-14 23:50:26.395792] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:26.057 [2024-05-14 23:50:26.497727] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:26.057 [2024-05-14 23:50:26.497812] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:26.057 [2024-05-14 23:50:26.497817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.316 [2024-05-14 23:50:26.661459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:26.316 [2024-05-14 23:50:26.661513] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:26.316 [2024-05-14 23:50:26.661529] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:26.316 [2024-05-14 23:50:26.669466] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:26.316 [2024-05-14 23:50:26.669492] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:26.316 [2024-05-14 23:50:26.677479] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:26.316 [2024-05-14 23:50:26.677502] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:26.316 [2024-05-14 23:50:26.754996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:26.316 [2024-05-14 23:50:26.755051] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:26.316 [2024-05-14 23:50:26.755070] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1881fa0 00:08:26.316 [2024-05-14 23:50:26.755082] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:26.316 [2024-05-14 23:50:26.756609] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:26.316 [2024-05-14 23:50:26.756638] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:26.575 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:26.575 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:08:26.575 23:50:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:26.835 I/O targets: 00:08:26.835 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:26.835 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:26.835 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:26.835 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:26.835 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:26.835 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:26.835 concat0: 131072 blocks of 512 bytes (64 MiB) 00:08:26.835 raid1: 65536 
blocks of 512 bytes (32 MiB) 00:08:26.835 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:26.835 00:08:26.835 00:08:26.835 CUnit - A unit testing framework for C - Version 2.1-3 00:08:26.835 http://cunit.sourceforge.net/ 00:08:26.835 00:08:26.835 00:08:26.835 Suite: bdevio tests on: AIO0 00:08:26.835 Test: blockdev write read block ...passed 00:08:26.835 Test: blockdev write zeroes read block ...passed 00:08:26.835 Test: blockdev write zeroes read no split ...passed 00:08:26.835 Test: blockdev write zeroes read split ...passed 00:08:26.835 Test: blockdev write zeroes read split partial ...passed 00:08:26.835 Test: blockdev reset ...passed 00:08:26.835 Test: blockdev write read 8 blocks ...passed 00:08:26.835 Test: blockdev write read size > 128k ...passed 00:08:26.835 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: raid1 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: concat0 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: 
blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: raid0 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: TestPT 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: Malloc2p7 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: Malloc2p6 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 
Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.836 Suite: bdevio tests on: Malloc2p5 00:08:26.836 Test: blockdev write read block ...passed 00:08:26.836 Test: blockdev write zeroes read block ...passed 00:08:26.836 Test: blockdev write zeroes read no split ...passed 00:08:26.836 Test: blockdev write zeroes read split ...passed 00:08:26.836 Test: blockdev write zeroes read split partial ...passed 00:08:26.836 Test: blockdev reset ...passed 00:08:26.836 Test: blockdev write read 8 blocks ...passed 00:08:26.836 Test: blockdev write read size > 128k ...passed 00:08:26.836 Test: blockdev write read invalid size ...passed 00:08:26.836 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.836 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.836 Test: blockdev write read max offset ...passed 00:08:26.836 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.836 Test: blockdev writev readv 8 blocks ...passed 00:08:26.836 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.836 Test: blockdev writev readv block ...passed 00:08:26.836 Test: blockdev writev readv size > 128k ...passed 00:08:26.836 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.836 Test: blockdev comparev and writev ...passed 00:08:26.836 Test: blockdev nvme passthru rw ...passed 00:08:26.836 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.836 Test: blockdev nvme admin passthru ...passed 00:08:26.836 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc2p4 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc2p3 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 
00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc2p2 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc2p1 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 
1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc2p0 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc1p1 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc1p0 
00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.837 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.837 Test: blockdev writev readv block ...passed 00:08:26.837 Test: blockdev writev readv size > 128k ...passed 00:08:26.837 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.837 Test: blockdev comparev and writev ...passed 00:08:26.837 Test: blockdev nvme passthru rw ...passed 00:08:26.837 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.837 Test: blockdev nvme admin passthru ...passed 00:08:26.837 Test: blockdev copy ...passed 00:08:26.837 Suite: bdevio tests on: Malloc0 00:08:26.837 Test: blockdev write read block ...passed 00:08:26.837 Test: blockdev write zeroes read block ...passed 00:08:26.837 Test: blockdev write zeroes read no split ...passed 00:08:26.837 Test: blockdev write zeroes read split ...passed 00:08:26.837 Test: blockdev write zeroes read split partial ...passed 00:08:26.837 Test: blockdev reset ...passed 00:08:26.837 Test: blockdev write read 8 blocks ...passed 00:08:26.837 Test: blockdev write read size > 128k ...passed 00:08:26.837 Test: blockdev write read invalid size ...passed 00:08:26.837 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:26.837 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:26.837 Test: blockdev write read max offset ...passed 00:08:26.837 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:26.837 Test: blockdev writev readv 8 blocks ...passed 00:08:26.838 Test: blockdev writev readv 30 x 1block ...passed 00:08:26.838 Test: blockdev writev readv block ...passed 00:08:26.838 Test: blockdev writev readv size > 128k ...passed 00:08:26.838 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:26.838 Test: blockdev comparev and writev ...passed 00:08:26.838 Test: blockdev nvme passthru rw ...passed 00:08:26.838 Test: blockdev nvme passthru vendor specific ...passed 00:08:26.838 Test: blockdev nvme admin passthru ...passed 00:08:26.838 Test: blockdev copy ...passed 00:08:26.838 00:08:26.838 Run Summary: Type Total Ran Passed Failed Inactive 00:08:26.838 suites 16 16 n/a 0 0 00:08:26.838 tests 368 368 368 0 0 00:08:26.838 asserts 2224 2224 2224 0 n/a 00:08:26.838 00:08:26.838 Elapsed time = 0.532 seconds 00:08:27.097 0 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 367737 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 367737 ']' 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 367737 00:08:27.097 23:50:27 
blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 367737 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:27.097 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:27.098 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 367737' 00:08:27.098 killing process with pid 367737 00:08:27.098 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@965 -- # kill 367737 00:08:27.098 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@970 -- # wait 367737 00:08:27.357 23:50:27 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:27.357 00:08:27.357 real 0m1.585s 00:08:27.357 user 0m3.705s 00:08:27.357 sys 0m0.489s 00:08:27.357 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:27.357 23:50:27 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:27.357 ************************************ 00:08:27.357 END TEST bdev_bounds 00:08:27.357 ************************************ 00:08:27.358 23:50:27 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:27.358 23:50:27 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:08:27.358 23:50:27 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:27.358 23:50:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:27.358 ************************************ 00:08:27.358 START TEST bdev_nbd 00:08:27.358 ************************************ 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=367965 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 367965 /var/tmp/spdk-nbd.sock 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 367965 ']' 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:27.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:27.358 23:50:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:27.617 [2024-05-14 23:50:27.976182] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
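bdev_nbd exports each bdev through the kernel NBD driver via the bdev_svc app started above (note the dedicated RPC socket /var/tmp/spdk-nbd.sock) and then exercises the resulting /dev/nbdX nodes with dd, as the nbd_start_disk calls further down show. Reduced to its essentials, the flow looks like this sketch (bdev and device names are examples, not the exact sequence the script runs):
# requires the nbd kernel module (the test checks /sys/module/nbd)
modprobe nbd
# start the SPDK app that will own the bdevs, on its own RPC socket
test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json test/bdev/bdev.json &
# export a bdev as a kernel block device
scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
# sanity I/O against the exported device
dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
# tear down the export
scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0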
00:08:27.617 [2024-05-14 23:50:27.976255] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:27.617 [2024-05-14 23:50:28.107639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.877 [2024-05-14 23:50:28.213414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.877 [2024-05-14 23:50:28.378158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:27.877 [2024-05-14 23:50:28.378229] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:27.877 [2024-05-14 23:50:28.378244] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:27.877 [2024-05-14 23:50:28.386164] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:27.877 [2024-05-14 23:50:28.386191] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:27.877 [2024-05-14 23:50:28.394174] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:27.877 [2024-05-14 23:50:28.394199] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:28.136 [2024-05-14 23:50:28.471555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:28.136 [2024-05-14 23:50:28.471608] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:28.136 [2024-05-14 23:50:28.471631] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e6930 00:08:28.136 [2024-05-14 23:50:28.471644] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:28.136 [2024-05-14 23:50:28.473099] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:28.136 [2024-05-14 23:50:28.473129] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 
'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:28.395 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.654 1+0 records in 00:08:28.654 1+0 records out 00:08:28.654 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269407 s, 15.2 MB/s 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:28.654 23:50:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.654 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:28.654 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:28.654 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:28.654 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.654 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.913 1+0 records in 00:08:28.913 1+0 records out 00:08:28.913 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284185 s, 14.4 MB/s 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.913 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.172 1+0 records in 00:08:29.172 1+0 records out 00:08:29.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271028 s, 15.1 MB/s 00:08:29.172 23:50:29 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.172 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.432 1+0 records in 00:08:29.432 1+0 records out 00:08:29.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319117 s, 12.8 MB/s 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.432 23:50:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 
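The block of autotest_common.sh trace lines that keeps repeating above is the per-device readiness check that nbd_rpc_start_stop_verify runs after each nbd_start_disk RPC. A minimal sketch of what that waitfornbd helper appears to do, reconstructed only from this xtrace — the real helper lives in the repo's common/autotest_common.sh (shown in the trace prefixes) and may differ in details; the retry sleep and the /tmp scratch path are assumptions of the sketch, while the trace itself writes to spdk/test/bdev/nbdtest:

    waitfornbd() {
        local nbd_name=$1
        local i size
        local scratch=/tmp/nbdtest   # assumption: the trace uses .../spdk/test/bdev/nbdtest

        # First wait for the kernel to publish the device in /proc/partitions
        # (the trace allows up to 20 attempts; it succeeds on the first one here).
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed back-off, not visible in this run
        done

        # Then prove the device actually serves reads: copy one 4 KiB block
        # with O_DIRECT and verify the scratch file ended up non-empty.
        for ((i = 1; i <= 20; i++)); do
            dd if="/dev/$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct
            size=$(stat -c %s "$scratch")
            rm -f "$scratch"
            [ "$size" != "0" ] && return 0
            sleep 0.1   # assumed back-off
        done
        return 1
    }

Every "1+0 records in / 1+0 records out" pair in this log is that single-block dd, which is why each exported device contributes roughly the same handful of @864-@885 lines before nbd_common.sh@27 increments i and moves on to the next bdev in the 16-entry list.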
00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.691 1+0 records in 00:08:29.691 1+0 records out 00:08:29.691 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000347209 s, 11.8 MB/s 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:29.691 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:29.692 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.692 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.692 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd5 /proc/partitions 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.951 1+0 records in 
00:08:29.951 1+0 records out 00:08:29.951 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000446245 s, 9.2 MB/s 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.951 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.952 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.211 1+0 records in 00:08:30.211 1+0 records out 00:08:30.211 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000421854 s, 9.7 MB/s 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:30.211 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:08:30.470 23:50:30 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.470 1+0 records in 00:08:30.470 1+0 records out 00:08:30.470 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000488718 s, 8.4 MB/s 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:30.470 23:50:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.729 1+0 records in 00:08:30.729 1+0 records out 00:08:30.729 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000539524 s, 7.6 MB/s 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:30.729 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.988 1+0 records in 00:08:30.988 1+0 records out 00:08:30.988 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000508881 s, 8.0 MB/s 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:30.988 23:50:31 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:31.247 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:31.247 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:31.247 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:31.247 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.509 1+0 records in 00:08:31.509 1+0 records out 00:08:31.509 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000664697 s, 6.2 MB/s 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:31.509 23:50:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( 
i = 1 )) 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.818 1+0 records in 00:08:31.818 1+0 records out 00:08:31.818 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000495648 s, 8.3 MB/s 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:31.818 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:31.819 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:31.819 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:31.819 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:31.819 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.078 1+0 records in 00:08:32.078 1+0 records out 00:08:32.078 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000754725 s, 5.4 MB/s 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- 
# (( i++ )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:32.078 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.337 1+0 records in 00:08:32.337 1+0 records out 00:08:32.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000940217 s, 4.4 MB/s 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:32.337 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:32.596 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:32.596 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:32.596 23:50:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:32.597 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:08:32.597 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:32.597 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:32.597 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:32.597 23:50:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:08:32.597 23:50:33 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.597 1+0 records in 00:08:32.597 1+0 records out 00:08:32.597 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000763932 s, 5.4 MB/s 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:32.597 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:32.856 1+0 records in 00:08:32.856 1+0 records out 00:08:32.856 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000561249 s, 7.3 MB/s 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:32.856 
23:50:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:32.856 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:33.114 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:33.114 { 00:08:33.114 "nbd_device": "/dev/nbd0", 00:08:33.114 "bdev_name": "Malloc0" 00:08:33.114 }, 00:08:33.114 { 00:08:33.114 "nbd_device": "/dev/nbd1", 00:08:33.114 "bdev_name": "Malloc1p0" 00:08:33.114 }, 00:08:33.114 { 00:08:33.114 "nbd_device": "/dev/nbd2", 00:08:33.114 "bdev_name": "Malloc1p1" 00:08:33.114 }, 00:08:33.114 { 00:08:33.114 "nbd_device": "/dev/nbd3", 00:08:33.114 "bdev_name": "Malloc2p0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd4", 00:08:33.115 "bdev_name": "Malloc2p1" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd5", 00:08:33.115 "bdev_name": "Malloc2p2" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd6", 00:08:33.115 "bdev_name": "Malloc2p3" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd7", 00:08:33.115 "bdev_name": "Malloc2p4" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd8", 00:08:33.115 "bdev_name": "Malloc2p5" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd9", 00:08:33.115 "bdev_name": "Malloc2p6" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd10", 00:08:33.115 "bdev_name": "Malloc2p7" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd11", 00:08:33.115 "bdev_name": "TestPT" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd12", 00:08:33.115 "bdev_name": "raid0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd13", 00:08:33.115 "bdev_name": "concat0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd14", 00:08:33.115 "bdev_name": "raid1" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd15", 00:08:33.115 "bdev_name": "AIO0" 00:08:33.115 } 00:08:33.115 ]' 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd0", 00:08:33.115 "bdev_name": "Malloc0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd1", 00:08:33.115 "bdev_name": "Malloc1p0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd2", 00:08:33.115 "bdev_name": "Malloc1p1" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd3", 00:08:33.115 "bdev_name": "Malloc2p0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd4", 00:08:33.115 "bdev_name": "Malloc2p1" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd5", 00:08:33.115 "bdev_name": "Malloc2p2" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd6", 00:08:33.115 "bdev_name": "Malloc2p3" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd7", 00:08:33.115 "bdev_name": "Malloc2p4" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd8", 00:08:33.115 "bdev_name": "Malloc2p5" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": 
"/dev/nbd9", 00:08:33.115 "bdev_name": "Malloc2p6" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd10", 00:08:33.115 "bdev_name": "Malloc2p7" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd11", 00:08:33.115 "bdev_name": "TestPT" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd12", 00:08:33.115 "bdev_name": "raid0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd13", 00:08:33.115 "bdev_name": "concat0" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd14", 00:08:33.115 "bdev_name": "raid1" 00:08:33.115 }, 00:08:33.115 { 00:08:33.115 "nbd_device": "/dev/nbd15", 00:08:33.115 "bdev_name": "AIO0" 00:08:33.115 } 00:08:33.115 ]' 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.115 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.374 23:50:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.633 23:50:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.633 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.892 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.151 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.410 23:50:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.410 23:50:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.670 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.929 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.188 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:35.447 23:50:35 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.447 23:50:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.706 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.965 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:36.223 
23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.223 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:36.482 23:50:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.482 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.742 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:37.000 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:37.001 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.260 23:50:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:37.519 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:37.779 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:37.779 /dev/nbd0 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:38.038 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.039 1+0 records in 00:08:38.039 1+0 records out 00:08:38.039 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264626 s, 15.5 MB/s 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.039 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:38.299 /dev/nbd1 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.299 1+0 records in 00:08:38.299 1+0 records out 00:08:38.299 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293321 s, 14.0 MB/s 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.299 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:38.559 /dev/nbd10 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 
)) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.559 1+0 records in 00:08:38.559 1+0 records out 00:08:38.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333067 s, 12.3 MB/s 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.559 23:50:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:38.818 /dev/nbd11 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.818 1+0 records in 00:08:38.818 1+0 records out 00:08:38.818 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311992 s, 13.1 MB/s 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.818 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:39.077 /dev/nbd12 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.077 1+0 records in 00:08:39.077 1+0 records out 00:08:39.077 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000457583 s, 9.0 MB/s 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.077 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:39.336 /dev/nbd13 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:39.336 23:50:39 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:39.336 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.337 1+0 records in 00:08:39.337 1+0 records out 00:08:39.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000367005 s, 11.2 MB/s 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.337 23:50:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:39.596 /dev/nbd14 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.596 1+0 records in 00:08:39.596 1+0 records out 00:08:39.596 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418205 s, 9.8 MB/s 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.596 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:39.855 /dev/nbd15 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.855 1+0 records in 00:08:39.855 1+0 records out 00:08:39.855 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000448338 s, 9.1 MB/s 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.855 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:40.114 /dev/nbd2 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:40.114 23:50:40 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.114 1+0 records in 00:08:40.114 1+0 records out 00:08:40.114 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000479266 s, 8.5 MB/s 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:40.114 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:40.373 /dev/nbd3 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.373 1+0 records in 00:08:40.373 1+0 records out 00:08:40.373 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000623921 s, 6.6 MB/s 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:40.373 23:50:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:40.632 /dev/nbd4 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:40.632 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.633 1+0 records in 00:08:40.633 1+0 records out 00:08:40.633 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000484342 s, 8.5 MB/s 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:40.633 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:40.892 /dev/nbd5 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:40.892 23:50:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd5 /proc/partitions 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.892 1+0 records in 00:08:40.892 1+0 records out 00:08:40.892 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000517409 s, 7.9 MB/s 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:40.892 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:41.151 /dev/nbd6 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.151 1+0 records in 00:08:41.151 1+0 records out 00:08:41.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000601581 s, 6.8 MB/s 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:41.151 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:41.411 /dev/nbd7 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.411 1+0 records in 00:08:41.411 1+0 records out 00:08:41.411 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000807019 s, 5.1 MB/s 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:41.411 23:50:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:41.411 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:41.670 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:41.670 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:41.670 /dev/nbd8 00:08:41.670 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:41.670 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.930 1+0 records in 00:08:41.930 1+0 records out 00:08:41.930 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00076582 s, 5.3 MB/s 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:41.930 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:42.189 /dev/nbd9 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:42.189 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:42.190 1+0 records in 00:08:42.190 1+0 records out 00:08:42.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00056142 s, 7.3 MB/s 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
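The nbd_start_disk calls above each hand a bdev to the NBD kernel driver over the RPC socket, and the trace then runs waitfornbd against the new node: poll /proc/partitions up to 20 times, read a single 4 KiB block back with O_DIRECT, and check that a non-empty file resulted. A minimal bash sketch of that polling pattern, reconstructed from the commands visible in the trace (the sleep interval and the mktemp scratch file are assumptions; the canonical helper lives in SPDK's test/common/autotest_common.sh and may differ in detail):

waitfornbd() {
    local nbd_name=$1 i tmp_file
    tmp_file=$(mktemp)    # the trace uses test/bdev/nbdtest instead

    # 1) wait until the kernel lists the device in /proc/partitions
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done

    # 2) read one 4 KiB block back through the device, bypassing the page cache
    for ((i = 1; i <= 20; i++)); do
        dd if="/dev/$nbd_name" of="$tmp_file" bs=4096 count=1 iflag=direct 2> /dev/null && break
        sleep 0.1
    done

    # 3) the read must have produced a non-empty file (the trace checks '[' 4096 '!=' 0 ']')
    local size
    size=$(stat -c %s "$tmp_file")
    rm -f "$tmp_file"
    [ "$size" != 0 ]
}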
00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:42.190 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd0", 00:08:42.450 "bdev_name": "Malloc0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd1", 00:08:42.450 "bdev_name": "Malloc1p0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd10", 00:08:42.450 "bdev_name": "Malloc1p1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd11", 00:08:42.450 "bdev_name": "Malloc2p0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd12", 00:08:42.450 "bdev_name": "Malloc2p1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd13", 00:08:42.450 "bdev_name": "Malloc2p2" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd14", 00:08:42.450 "bdev_name": "Malloc2p3" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd15", 00:08:42.450 "bdev_name": "Malloc2p4" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd2", 00:08:42.450 "bdev_name": "Malloc2p5" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd3", 00:08:42.450 "bdev_name": "Malloc2p6" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd4", 00:08:42.450 "bdev_name": "Malloc2p7" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd5", 00:08:42.450 "bdev_name": "TestPT" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd6", 00:08:42.450 "bdev_name": "raid0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd7", 00:08:42.450 "bdev_name": "concat0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd8", 00:08:42.450 "bdev_name": "raid1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd9", 00:08:42.450 "bdev_name": "AIO0" 00:08:42.450 } 00:08:42.450 ]' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd0", 00:08:42.450 "bdev_name": "Malloc0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd1", 00:08:42.450 "bdev_name": "Malloc1p0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd10", 00:08:42.450 "bdev_name": "Malloc1p1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd11", 00:08:42.450 "bdev_name": "Malloc2p0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd12", 00:08:42.450 "bdev_name": "Malloc2p1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd13", 00:08:42.450 "bdev_name": "Malloc2p2" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd14", 00:08:42.450 "bdev_name": "Malloc2p3" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd15", 
00:08:42.450 "bdev_name": "Malloc2p4" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd2", 00:08:42.450 "bdev_name": "Malloc2p5" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd3", 00:08:42.450 "bdev_name": "Malloc2p6" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd4", 00:08:42.450 "bdev_name": "Malloc2p7" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd5", 00:08:42.450 "bdev_name": "TestPT" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd6", 00:08:42.450 "bdev_name": "raid0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd7", 00:08:42.450 "bdev_name": "concat0" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd8", 00:08:42.450 "bdev_name": "raid1" 00:08:42.450 }, 00:08:42.450 { 00:08:42.450 "nbd_device": "/dev/nbd9", 00:08:42.450 "bdev_name": "AIO0" 00:08:42.450 } 00:08:42.450 ]' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:42.450 /dev/nbd1 00:08:42.450 /dev/nbd10 00:08:42.450 /dev/nbd11 00:08:42.450 /dev/nbd12 00:08:42.450 /dev/nbd13 00:08:42.450 /dev/nbd14 00:08:42.450 /dev/nbd15 00:08:42.450 /dev/nbd2 00:08:42.450 /dev/nbd3 00:08:42.450 /dev/nbd4 00:08:42.450 /dev/nbd5 00:08:42.450 /dev/nbd6 00:08:42.450 /dev/nbd7 00:08:42.450 /dev/nbd8 00:08:42.450 /dev/nbd9' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:42.450 /dev/nbd1 00:08:42.450 /dev/nbd10 00:08:42.450 /dev/nbd11 00:08:42.450 /dev/nbd12 00:08:42.450 /dev/nbd13 00:08:42.450 /dev/nbd14 00:08:42.450 /dev/nbd15 00:08:42.450 /dev/nbd2 00:08:42.450 /dev/nbd3 00:08:42.450 /dev/nbd4 00:08:42.450 /dev/nbd5 00:08:42.450 /dev/nbd6 00:08:42.450 /dev/nbd7 00:08:42.450 /dev/nbd8 00:08:42.450 /dev/nbd9' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:42.450 256+0 records in 00:08:42.450 256+0 records out 00:08:42.450 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104406 s, 100 MB/s 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.450 23:50:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:42.709 256+0 records in 00:08:42.709 256+0 records out 00:08:42.709 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183371 s, 5.7 MB/s 00:08:42.709 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.709 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:42.709 256+0 records in 00:08:42.709 256+0 records out 00:08:42.709 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185154 s, 5.7 MB/s 00:08:42.709 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.709 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:42.967 256+0 records in 00:08:42.967 256+0 records out 00:08:42.967 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185328 s, 5.7 MB/s 00:08:42.967 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.967 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:43.226 256+0 records in 00:08:43.226 256+0 records out 00:08:43.226 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183667 s, 5.7 MB/s 00:08:43.226 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.226 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:43.485 256+0 records in 00:08:43.485 256+0 records out 00:08:43.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185453 s, 5.7 MB/s 00:08:43.485 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.485 23:50:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:43.485 256+0 records in 00:08:43.485 256+0 records out 00:08:43.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185238 s, 5.7 MB/s 00:08:43.485 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.485 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:43.745 256+0 records in 00:08:43.745 256+0 records out 00:08:43.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185404 s, 5.7 MB/s 00:08:43.745 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.745 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:44.003 256+0 records in 00:08:44.003 256+0 records out 00:08:44.003 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182309 s, 5.8 MB/s 00:08:44.003 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.003 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:44.003 256+0 records in 00:08:44.003 256+0 records out 00:08:44.003 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184563 s, 5.7 MB/s 00:08:44.003 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.003 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:44.261 256+0 records in 00:08:44.261 256+0 records out 00:08:44.261 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185321 s, 5.7 MB/s 00:08:44.261 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.261 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:44.520 256+0 records in 00:08:44.520 256+0 records out 00:08:44.520 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185516 s, 5.7 MB/s 00:08:44.520 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.520 23:50:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:44.779 256+0 records in 00:08:44.779 256+0 records out 00:08:44.779 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18559 s, 5.6 MB/s 00:08:44.779 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.779 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:44.779 256+0 records in 00:08:44.779 256+0 records out 00:08:44.779 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18555 s, 5.7 MB/s 00:08:44.779 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.779 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:45.037 256+0 records in 00:08:45.037 256+0 records out 00:08:45.037 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181639 s, 5.8 MB/s 00:08:45.037 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:45.037 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:45.328 256+0 records in 00:08:45.328 256+0 records out 00:08:45.328 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.189943 s, 5.5 MB/s 00:08:45.328 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:45.328 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:45.587 
256+0 records in 00:08:45.587 256+0 records out 00:08:45.587 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184249 s, 5.7 MB/s 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
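The surrounding block is the nbd_dd_data_verify pass: dd fills a 1 MiB scratch file from /dev/urandom, writes it onto each of the 16 NBD devices with oflag=direct, then reads each device back with cmp -b -n 1M before removing the scratch file. A rough sketch of that write/verify flow, reconstructed from the trace (the mktemp scratch file and the shortened device list are assumptions standing in for the nbdrandtest path and the full 16-device array used above):

nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10)   # illustrative subset of the 16 devices
tmp_file=$(mktemp)                          # the trace uses test/bdev/nbdrandtest

# write phase: 1 MiB of random data, pushed onto every device with O_DIRECT
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done

# verify phase: the first 1 MiB read back from each device must match the file
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
done
rm -f "$tmp_file"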
00:08:45.588 23:50:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.588 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:45.848 23:50:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.848 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.107 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.366 23:50:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.625 23:50:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.625 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.884 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.142 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:47.401 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:47.401 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:47.401 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:47.401 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.401 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.659 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:47.659 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.659 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.659 23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.659 
23:50:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.918 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.176 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:48.435 
23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.435 23:50:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.694 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.953 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:49.212 
23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.212 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.470 23:50:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.729 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@104 -- # count=0 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:49.987 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:49.988 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:50.246 malloc_lvol_verify 00:08:50.246 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:50.504 b462de3a-6ab5-4b71-99a1-0ade359f6783 00:08:50.504 23:50:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:50.763 6e94ece4-651f-42ce-bb77-a1256d60cfb4 00:08:50.763 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:51.021 /dev/nbd0 00:08:51.021 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:51.021 mke2fs 1.46.5 (30-Dec-2021) 00:08:51.022 Discarding device blocks: 0/4096 done 00:08:51.022 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:51.022 00:08:51.022 Allocating group tables: 0/1 done 00:08:51.022 Writing inode tables: 0/1 done 00:08:51.022 Creating journal (1024 blocks): done 00:08:51.022 Writing superblocks and filesystem accounting information: 0/1 done 00:08:51.022 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.022 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:51.280 23:50:51 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:51.280 23:50:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 367965 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 367965 ']' 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 367965 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 367965 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 367965' 00:08:51.281 killing process with pid 367965 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@965 -- # kill 367965 00:08:51.281 23:50:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@970 -- # wait 367965 00:08:51.540 23:50:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:51.540 00:08:51.540 real 0m24.186s 00:08:51.540 user 0m29.473s 00:08:51.540 sys 0m14.189s 00:08:51.540 23:50:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:51.540 23:50:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:51.540 ************************************ 00:08:51.540 END TEST bdev_nbd 00:08:51.540 ************************************ 00:08:51.799 23:50:52 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:51.800 23:50:52 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:51.800 23:50:52 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:51.800 23:50:52 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:51.800 23:50:52 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:51.800 23:50:52 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:51.800 23:50:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:51.800 ************************************ 00:08:51.800 START TEST bdev_fio 00:08:51.800 ************************************ 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:08:51.800 23:50:52 
blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:51.800 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:51.800 23:50:52 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:51.800 23:50:52 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:51.800 ************************************ 00:08:51.800 START TEST bdev_fio_rw_verify 00:08:51.800 ************************************ 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:08:51.800 23:50:52 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:51.800 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:51.801 23:50:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:52.365 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 
job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.365 fio-3.35 00:08:52.365 Starting 16 threads 00:09:04.569 00:09:04.569 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=371971: Tue May 14 23:51:03 2024 00:09:04.569 read: IOPS=88.9k, BW=347MiB/s (364MB/s)(3474MiB/10001msec) 00:09:04.569 slat (usec): min=2, max=459, avg=35.78, stdev=15.71 00:09:04.569 clat (usec): min=9, max=1151, avg=297.73, stdev=141.53 00:09:04.569 lat (usec): min=20, max=1188, avg=333.50, stdev=150.91 00:09:04.569 clat percentiles (usec): 00:09:04.569 | 50.000th=[ 289], 99.000th=[ 627], 99.900th=[ 701], 99.990th=[ 971], 00:09:04.569 | 99.999th=[ 1090] 00:09:04.569 write: IOPS=140k, BW=548MiB/s (574MB/s)(5404MiB/9869msec); 0 zone resets 00:09:04.569 slat (usec): min=6, max=397, avg=48.62, stdev=17.05 00:09:04.569 clat (usec): min=11, max=4135, avg=351.69, stdev=167.44 00:09:04.569 lat (usec): min=28, max=4200, avg=400.30, stdev=177.39 00:09:04.569 clat percentiles (usec): 00:09:04.570 | 50.000th=[ 334], 99.000th=[ 791], 99.900th=[ 1057], 99.990th=[ 1139], 00:09:04.570 | 99.999th=[ 1991] 00:09:04.570 bw ( KiB/s): min=463560, max=734855, per=98.87%, avg=554402.84, stdev=4311.22, samples=304 00:09:04.570 iops : min=115890, max=183710, avg=138600.37, stdev=1077.77, samples=304 00:09:04.570 lat (usec) : 10=0.01%, 20=0.04%, 50=0.58%, 100=4.47%, 250=30.11% 00:09:04.570 lat (usec) : 500=50.11%, 750=13.78%, 1000=0.77% 00:09:04.570 lat (msec) : 2=0.14%, 4=0.01%, 10=0.01% 00:09:04.570 cpu : usr=99.28%, sys=0.31%, ctx=616, majf=0, minf=1700 00:09:04.570 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:04.570 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:04.570 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:04.570 issued rwts: total=889411,1383545,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:04.570 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:04.570 00:09:04.570 Run status group 0 (all jobs): 00:09:04.570 READ: bw=347MiB/s (364MB/s), 347MiB/s-347MiB/s (364MB/s-364MB/s), io=3474MiB (3643MB), run=10001-10001msec 00:09:04.570 WRITE: bw=548MiB/s (574MB/s), 548MiB/s-548MiB/s (574MB/s-574MB/s), io=5404MiB (5667MB), run=9869-9869msec 00:09:04.570 00:09:04.570 real 0m11.533s 00:09:04.570 user 2m45.280s 00:09:04.570 sys 0m1.331s 00:09:04.570 23:51:03 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:04.570 23:51:03 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:04.570 ************************************ 00:09:04.570 END TEST bdev_fio_rw_verify 00:09:04.570 ************************************ 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:04.570 23:51:03 
blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:09:04.570 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:04.571 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "fa6ddae7-c906-49d6-9195-9609537935ef"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa6ddae7-c906-49d6-9195-9609537935ef",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "cfdabea7-e0b7-5e19-979d-b6e4703f4793"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "cfdabea7-e0b7-5e19-979d-b6e4703f4793",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "56b2027c-e9b7-5bd8-881f-9bf369ada9d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "56b2027c-e9b7-5bd8-881f-9bf369ada9d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "2dd09488-2b29-510a-8d6a-2d34a949e2fe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2dd09488-2b29-510a-8d6a-2d34a949e2fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "11788ef6-030c-5e5e-8352-b986b8528e86"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11788ef6-030c-5e5e-8352-b986b8528e86",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "09fd8caf-7ac9-58c2-a791-4bfc377303ec"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09fd8caf-7ac9-58c2-a791-4bfc377303ec",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' 
},' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "4de189f3-5463-5012-a4cb-ba3951d12e5f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4de189f3-5463-5012-a4cb-ba3951d12e5f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6e6a484f-6272-5d0d-a2b1-279240238d05"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6e6a484f-6272-5d0d-a2b1-279240238d05",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1858bfa4-8ca2-5d90-843d-24db1013bd78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1858bfa4-8ca2-5d90-843d-24db1013bd78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "fa731af3-e4a5-5d17-9028-137ebf186bd0"' ' ],' 
' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa731af3-e4a5-5d17-9028-137ebf186bd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "d1284f53-1877-4171-a41e-84dc59ebc642"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1376d111-c140-43cf-be04-fcaad502d39d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "43a7f8f4-1a90-40fe-93ea-69acd6f0811b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bfe7e916-fd61-420f-af6e-adbacd4e56fe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "strip_size_kb": 64,' ' "state": 
"online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "18a32a47-4f4c-438d-90c8-a6e11ef81c25",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "57465bac-a2ed-4b9d-ad46-2ccf19a907b4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8dff0970-634c-4250-8812-173f3532f5e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "3488c802-d3a9-4229-a8fd-5248bf167ca7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:04.571 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:04.571 Malloc1p0 00:09:04.571 Malloc1p1 00:09:04.571 Malloc2p0 00:09:04.571 Malloc2p1 00:09:04.571 Malloc2p2 00:09:04.571 Malloc2p3 00:09:04.571 Malloc2p4 00:09:04.571 Malloc2p5 00:09:04.571 Malloc2p6 00:09:04.571 Malloc2p7 00:09:04.571 TestPT 00:09:04.571 raid0 00:09:04.571 concat0 ]] 00:09:04.571 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "fa6ddae7-c906-49d6-9195-9609537935ef"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa6ddae7-c906-49d6-9195-9609537935ef",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "cfdabea7-e0b7-5e19-979d-b6e4703f4793"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "cfdabea7-e0b7-5e19-979d-b6e4703f4793",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "56b2027c-e9b7-5bd8-881f-9bf369ada9d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "56b2027c-e9b7-5bd8-881f-9bf369ada9d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "95b592a7-b30c-599c-a4ea-5cb1d6d7cfa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "2dd09488-2b29-510a-8d6a-2d34a949e2fe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2dd09488-2b29-510a-8d6a-2d34a949e2fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' 
"claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "11788ef6-030c-5e5e-8352-b986b8528e86"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11788ef6-030c-5e5e-8352-b986b8528e86",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "09fd8caf-7ac9-58c2-a791-4bfc377303ec"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "09fd8caf-7ac9-58c2-a791-4bfc377303ec",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "4de189f3-5463-5012-a4cb-ba3951d12e5f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4de189f3-5463-5012-a4cb-ba3951d12e5f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f8cdcbc7-0e8f-5965-93fe-ac2b1fd6343f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6e6a484f-6272-5d0d-a2b1-279240238d05"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6e6a484f-6272-5d0d-a2b1-279240238d05",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "1858bfa4-8ca2-5d90-843d-24db1013bd78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1858bfa4-8ca2-5d90-843d-24db1013bd78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "fa731af3-e4a5-5d17-9028-137ebf186bd0"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fa731af3-e4a5-5d17-9028-137ebf186bd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "d1284f53-1877-4171-a41e-84dc59ebc642"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d1284f53-1877-4171-a41e-84dc59ebc642",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' 
' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1376d111-c140-43cf-be04-fcaad502d39d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "43a7f8f4-1a90-40fe-93ea-69acd6f0811b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "bfe7e916-fd61-420f-af6e-adbacd4e56fe"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bfe7e916-fd61-420f-af6e-adbacd4e56fe",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "18a32a47-4f4c-438d-90c8-a6e11ef81c25",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "57465bac-a2ed-4b9d-ad46-2ccf19a907b4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "3ebb59dd-a76e-4b1a-8ac0-b7f5b7cb1c69",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "8dff0970-634c-4250-8812-173f3532f5e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "3488c802-d3a9-4229-a8fd-5248bf167ca7",' ' 
"is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ca11e7-b913-452c-a97b-d9fa2f7a74e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:04.572 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:04.573 23:51:03 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:04.573 ************************************ 00:09:04.573 START TEST bdev_fio_trim 00:09:04.573 ************************************ 00:09:04.573 
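For reference, the per-bdev job sections traced above are produced by a loop equivalent to the sketch below. The jq filter and the echo lines are copied verbatim from the trace; the ">> bdev.fio" destination and any global fio options (rw, bs, etc.) are assumptions, since the template file itself is not shown in this excerpt:

  # append one fio job section per bdev that reports unmap support
  for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name'); do
          echo "[job_$b]"        # e.g. [job_Malloc0]
          echo "filename=$b"     # e.g. filename=Malloc0
  done >> bdev.fio               # assumed destination: test/bdev/bdev.fio

Bdevs whose JSON reports "unmap": false (raid1 and AIO0 above) are filtered out, which is why they do not appear in the job list that fio prints next.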
23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:04.573 23:51:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:04.573 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:04.573 fio-3.35 00:09:04.573 Starting 14 threads 00:09:16.828 00:09:16.828 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=373660: Tue May 14 23:51:15 2024 00:09:16.828 write: IOPS=120k, BW=468MiB/s (490MB/s)(4676MiB/10001msec); 0 zone resets 00:09:16.828 slat (usec): min=3, max=649, avg=40.31, stdev=11.70 00:09:16.828 clat (usec): min=27, max=4251, avg=296.06, stdev=102.30 00:09:16.828 lat (usec): min=41, max=4285, avg=336.37, stdev=106.93 00:09:16.828 clat percentiles (usec): 00:09:16.828 | 50.000th=[ 289], 99.000th=[ 523], 99.900th=[ 578], 99.990th=[ 635], 00:09:16.828 | 99.999th=[ 914] 00:09:16.828 bw ( KiB/s): min=434661, max=525184, per=100.00%, avg=479421.47, stdev=2021.59, samples=266 00:09:16.828 iops : min=108665, max=131296, avg=119855.00, stdev=505.39, samples=266 00:09:16.828 trim: IOPS=120k, BW=468MiB/s (490MB/s)(4676MiB/10001msec); 0 zone resets 00:09:16.828 slat (usec): min=4, max=148, avg=27.64, stdev= 7.65 00:09:16.828 clat (usec): min=11, max=4285, avg=335.78, stdev=108.02 00:09:16.828 lat (usec): min=24, max=4311, avg=363.42, stdev=111.16 00:09:16.828 clat percentiles (usec): 00:09:16.828 | 50.000th=[ 330], 99.000th=[ 570], 99.900th=[ 635], 99.990th=[ 693], 00:09:16.828 | 99.999th=[ 963] 00:09:16.828 bw ( KiB/s): min=434661, max=525192, per=100.00%, avg=479421.47, stdev=2021.61, samples=266 00:09:16.828 iops : min=108665, max=131298, 
avg=119854.89, stdev=505.39, samples=266 00:09:16.828 lat (usec) : 20=0.01%, 50=0.04%, 100=0.82%, 250=29.65%, 500=65.08% 00:09:16.828 lat (usec) : 750=4.40%, 1000=0.01% 00:09:16.828 lat (msec) : 2=0.01%, 10=0.01% 00:09:16.828 cpu : usr=99.63%, sys=0.00%, ctx=557, majf=0, minf=1033 00:09:16.828 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:16.828 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:16.828 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:16.828 issued rwts: total=0,1196977,1196979,0 short=0,0,0,0 dropped=0,0,0,0 00:09:16.828 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:16.828 00:09:16.828 Run status group 0 (all jobs): 00:09:16.828 WRITE: bw=468MiB/s (490MB/s), 468MiB/s-468MiB/s (490MB/s-490MB/s), io=4676MiB (4903MB), run=10001-10001msec 00:09:16.828 TRIM: bw=468MiB/s (490MB/s), 468MiB/s-468MiB/s (490MB/s-490MB/s), io=4676MiB (4903MB), run=10001-10001msec 00:09:16.828 00:09:16.828 real 0m11.433s 00:09:16.828 user 2m25.404s 00:09:16.828 sys 0m0.667s 00:09:16.828 23:51:15 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:16.828 23:51:15 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:16.828 ************************************ 00:09:16.828 END TEST bdev_fio_trim 00:09:16.828 ************************************ 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:16.828 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:16.828 00:09:16.828 real 0m23.345s 00:09:16.828 user 5m10.901s 00:09:16.828 sys 0m2.182s 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:16.828 23:51:15 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:16.828 ************************************ 00:09:16.828 END TEST bdev_fio 00:09:16.828 ************************************ 00:09:16.828 23:51:15 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:16.828 23:51:15 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:16.828 23:51:15 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:09:16.828 23:51:15 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:16.829 23:51:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:16.829 ************************************ 00:09:16.829 START TEST bdev_verify 00:09:16.829 ************************************ 00:09:16.829 23:51:15 blockdev_general.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:16.829 [2024-05-14 23:51:15.660927] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:09:16.829 [2024-05-14 23:51:15.660988] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid375479 ] 00:09:16.829 [2024-05-14 23:51:15.788578] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:16.829 [2024-05-14 23:51:15.899821] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.829 [2024-05-14 23:51:15.899827] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.829 [2024-05-14 23:51:16.047558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:16.829 [2024-05-14 23:51:16.047614] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:16.829 [2024-05-14 23:51:16.047632] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:16.829 [2024-05-14 23:51:16.055567] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:16.829 [2024-05-14 23:51:16.055598] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:16.829 [2024-05-14 23:51:16.063582] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:16.829 [2024-05-14 23:51:16.063610] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:16.829 [2024-05-14 23:51:16.136511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:16.829 [2024-05-14 23:51:16.136562] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:16.829 [2024-05-14 23:51:16.136585] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x248d890 00:09:16.829 [2024-05-14 23:51:16.136607] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:16.829 [2024-05-14 23:51:16.138167] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:16.829 [2024-05-14 23:51:16.138199] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:16.829 Running I/O for 5 seconds... 
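Stripped of the run_test and xtrace wrappers, the verify pass started above comes down to a single bdevperf invocation along these lines (arguments copied from the trace; shown only as a sketch for reproducing the workload by hand):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
          --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
          -q 128 -o 4096 -w verify -t 5 -C -m 0x3

That is: queue depth 128, 4096-byte I/Os, the verify workload for 5 seconds on core mask 0x3 (the two reactors reported above).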
00:09:22.095 00:09:22.095 Latency(us) 00:09:22.095 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:22.095 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x1000 00:09:22.095 Malloc0 : 5.09 1180.79 4.61 0.00 0.00 108189.95 559.19 232510.33 00:09:22.095 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x1000 length 0x1000 00:09:22.095 Malloc0 : 5.07 1162.17 4.54 0.00 0.00 109913.35 609.06 373840.14 00:09:22.095 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x800 00:09:22.095 Malloc1p0 : 5.10 602.70 2.35 0.00 0.00 211283.32 3604.48 220656.86 00:09:22.095 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x800 length 0x800 00:09:22.095 Malloc1p0 : 5.07 606.07 2.37 0.00 0.00 210118.31 3647.22 206979.78 00:09:22.095 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x800 00:09:22.095 Malloc1p1 : 5.10 602.45 2.35 0.00 0.00 210761.48 3490.50 217921.45 00:09:22.095 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x800 length 0x800 00:09:22.095 Malloc1p1 : 5.23 611.55 2.39 0.00 0.00 207676.43 3504.75 202420.76 00:09:22.095 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x200 00:09:22.095 Malloc2p0 : 5.10 602.19 2.35 0.00 0.00 210234.71 3547.49 213362.42 00:09:22.095 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x200 length 0x200 00:09:22.095 Malloc2p0 : 5.23 611.28 2.39 0.00 0.00 207178.33 3547.49 198773.54 00:09:22.095 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x200 00:09:22.095 Malloc2p1 : 5.24 610.62 2.39 0.00 0.00 206814.03 3632.97 206979.78 00:09:22.095 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x200 length 0x200 00:09:22.095 Malloc2p1 : 5.24 611.00 2.39 0.00 0.00 206687.18 3647.22 195126.32 00:09:22.095 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x200 00:09:22.095 Malloc2p2 : 5.24 610.38 2.38 0.00 0.00 206312.59 3561.74 203332.56 00:09:22.095 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x200 length 0x200 00:09:22.095 Malloc2p2 : 5.24 610.74 2.39 0.00 0.00 206189.57 3575.99 189655.49 00:09:22.095 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x200 00:09:22.095 Malloc2p3 : 5.24 610.14 2.38 0.00 0.00 205820.15 3561.74 199685.34 00:09:22.095 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x200 length 0x200 00:09:22.095 Malloc2p3 : 5.24 610.51 2.38 0.00 0.00 205700.80 3561.74 185096.46 00:09:22.095 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x0 length 0x200 00:09:22.095 Malloc2p4 : 5.25 609.91 2.38 0.00 0.00 205351.25 
3590.23 196949.93 00:09:22.095 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.095 Verification LBA range: start 0x200 length 0x200 00:09:22.096 Malloc2p4 : 5.24 610.27 2.38 0.00 0.00 205238.10 3590.23 182361.04 00:09:22.096 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x200 00:09:22.096 Malloc2p5 : 5.25 609.68 2.38 0.00 0.00 204869.94 3376.53 193302.71 00:09:22.096 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x200 length 0x200 00:09:22.096 Malloc2p5 : 5.25 610.04 2.38 0.00 0.00 204781.46 3419.27 179625.63 00:09:22.096 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x200 00:09:22.096 Malloc2p6 : 5.25 609.44 2.38 0.00 0.00 204397.87 3348.03 190567.29 00:09:22.096 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x200 length 0x200 00:09:22.096 Malloc2p6 : 5.25 609.80 2.38 0.00 0.00 204321.84 3348.03 176890.21 00:09:22.096 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x200 00:09:22.096 Malloc2p7 : 5.25 609.13 2.38 0.00 0.00 203903.66 3305.29 187831.87 00:09:22.096 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x200 length 0x200 00:09:22.096 Malloc2p7 : 5.25 609.56 2.38 0.00 0.00 203837.09 3348.03 171419.38 00:09:22.096 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x1000 00:09:22.096 TestPT : 5.26 585.84 2.29 0.00 0.00 210325.54 15158.76 189655.49 00:09:22.096 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x1000 length 0x1000 00:09:22.096 TestPT : 5.26 587.28 2.29 0.00 0.00 210528.53 11340.58 255305.46 00:09:22.096 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x2000 00:09:22.096 raid0 : 5.26 608.60 2.38 0.00 0.00 202862.53 3533.25 169595.77 00:09:22.096 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x2000 length 0x2000 00:09:22.096 raid0 : 5.25 609.11 2.38 0.00 0.00 202713.58 3504.75 154095.08 00:09:22.096 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x2000 00:09:22.096 concat0 : 5.26 608.33 2.38 0.00 0.00 202408.18 3319.54 166860.35 00:09:22.096 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x2000 length 0x2000 00:09:22.096 concat0 : 5.26 608.69 2.38 0.00 0.00 202368.02 3333.79 150447.86 00:09:22.096 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x0 length 0x1000 00:09:22.096 raid1 : 5.26 608.01 2.38 0.00 0.00 201988.88 3789.69 163213.13 00:09:22.096 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x1000 length 0x1000 00:09:22.096 raid1 : 5.26 608.42 2.38 0.00 0.00 201907.74 3960.65 155006.89 00:09:22.096 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: 
start 0x0 length 0x4e2 00:09:22.096 AIO0 : 5.27 607.68 2.37 0.00 0.00 201527.30 1524.42 158654.11 00:09:22.096 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:22.096 Verification LBA range: start 0x4e2 length 0x4e2 00:09:22.096 AIO0 : 5.26 608.20 2.38 0.00 0.00 201417.44 1524.42 162301.33 00:09:22.096 =================================================================================================================== 00:09:22.096 Total : 20560.59 80.31 0.00 0.00 194885.31 559.19 373840.14 00:09:22.096 00:09:22.096 real 0m6.535s 00:09:22.096 user 0m12.126s 00:09:22.096 sys 0m0.385s 00:09:22.096 23:51:22 blockdev_general.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:22.096 23:51:22 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:22.096 ************************************ 00:09:22.096 END TEST bdev_verify 00:09:22.096 ************************************ 00:09:22.096 23:51:22 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:22.096 23:51:22 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:09:22.096 23:51:22 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:22.096 23:51:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:22.096 ************************************ 00:09:22.096 START TEST bdev_verify_big_io 00:09:22.096 ************************************ 00:09:22.096 23:51:22 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:22.096 [2024-05-14 23:51:22.285824] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:09:22.096 [2024-05-14 23:51:22.285885] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid376370 ] 00:09:22.096 [2024-05-14 23:51:22.416527] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:22.096 [2024-05-14 23:51:22.519712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:22.096 [2024-05-14 23:51:22.519721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.096 [2024-05-14 23:51:22.670953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:22.096 [2024-05-14 23:51:22.671021] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:22.096 [2024-05-14 23:51:22.671040] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:22.096 [2024-05-14 23:51:22.678963] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:22.096 [2024-05-14 23:51:22.679006] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:22.354 [2024-05-14 23:51:22.686977] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:22.354 [2024-05-14 23:51:22.687007] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:22.354 [2024-05-14 23:51:22.759620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:22.354 [2024-05-14 23:51:22.759673] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:22.354 [2024-05-14 23:51:22.759697] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2028890 00:09:22.354 [2024-05-14 23:51:22.759714] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:22.354 [2024-05-14 23:51:22.761277] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:22.354 [2024-05-14 23:51:22.761308] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:22.354 [2024-05-14 23:51:22.944493] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.945962] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.948061] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.949343] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.951131] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.952294] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.954055] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.955828] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.957004] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.958670] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.959592] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.961022] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.961929] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.963364] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.964296] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.965746] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:22.613 [2024-05-14 23:51:22.989450] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:22.613 [2024-05-14 23:51:22.991454] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:22.613 Running I/O for 5 seconds... 00:09:30.720 00:09:30.720 Latency(us) 00:09:30.720 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:30.720 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x100 00:09:30.720 Malloc0 : 5.91 173.20 10.83 0.00 0.00 724262.97 894.00 1940321.50 00:09:30.720 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x100 length 0x100 00:09:30.720 Malloc0 : 6.00 149.35 9.33 0.00 0.00 840848.70 908.24 2290454.71 00:09:30.720 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x80 00:09:30.720 Malloc1p0 : 6.69 35.89 2.24 0.00 0.00 3239560.31 1510.18 5514597.95 00:09:30.720 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x80 length 0x80 00:09:30.720 Malloc1p0 : 6.30 86.98 5.44 0.00 0.00 1357211.06 2592.95 2728121.21 00:09:30.720 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x80 00:09:30.720 Malloc1p1 : 6.69 35.89 2.24 0.00 0.00 3133077.76 1510.18 5310353.59 00:09:30.720 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x80 length 0x80 00:09:30.720 Malloc1p1 : 6.74 35.62 2.23 0.00 0.00 3155840.21 1531.55 5427064.65 00:09:30.720 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p0 : 6.29 25.42 1.59 0.00 0.00 1121035.81 655.36 2129976.99 00:09:30.720 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p0 : 6.21 23.17 1.45 0.00 0.00 1217535.20 658.92 1998677.04 00:09:30.720 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p1 : 6.30 25.41 1.59 0.00 0.00 1111101.76 626.87 2115388.10 00:09:30.720 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p1 : 6.22 23.17 1.45 0.00 0.00 1206739.91 666.05 1984088.15 00:09:30.720 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p2 : 6.30 25.41 1.59 0.00 0.00 1101098.04 648.24 2071621.45 00:09:30.720 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p2 : 6.22 23.16 1.45 0.00 0.00 1196350.79 662.48 1954910.39 00:09:30.720 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p3 : 6.30 25.40 1.59 0.00 0.00 1091092.89 655.36 2042443.69 00:09:30.720 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p3 : 6.22 23.16 1.45 0.00 0.00 1185282.11 666.05 1925732.62 00:09:30.720 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p4 : 6.30 25.40 1.59 0.00 0.00 1081388.35 648.24 2013265.92 00:09:30.720 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p4 : 6.30 25.38 1.59 0.00 0.00 1087094.02 666.05 1896554.85 00:09:30.720 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p5 : 6.30 25.39 1.59 0.00 0.00 1071480.24 637.55 1998677.04 00:09:30.720 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.720 Malloc2p5 : 6.31 25.37 1.59 0.00 0.00 1077094.21 673.17 1881965.97 00:09:30.720 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x0 length 0x20 00:09:30.720 Malloc2p6 : 6.30 25.38 1.59 0.00 0.00 1062433.00 655.36 1969499.27 00:09:30.720 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.720 Verification LBA range: start 0x20 length 0x20 00:09:30.721 Malloc2p6 : 6.31 25.37 1.59 0.00 0.00 1067318.97 666.05 1852788.20 00:09:30.721 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x20 00:09:30.721 Malloc2p7 : 6.31 25.37 1.59 0.00 0.00 1052523.31 648.24 1940321.50 00:09:30.721 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x20 length 0x20 00:09:30.721 Malloc2p7 : 6.31 25.36 1.58 0.00 0.00 1057818.43 658.92 1823610.43 00:09:30.721 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x100 00:09:30.721 TestPT : 6.81 37.57 2.35 0.00 0.00 2689592.95 1517.30 4901864.85 00:09:30.721 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x100 length 0x100 00:09:30.721 TestPT : 6.80 33.55 2.10 0.00 0.00 3045141.49 96651.35 3501332.03 00:09:30.721 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x200 00:09:30.721 raid0 : 6.55 43.99 2.75 0.00 0.00 2286801.14 1638.40 4697620.48 00:09:30.721 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x200 length 0x200 00:09:30.721 raid0 : 6.64 40.97 2.56 0.00 0.00 2441932.24 1659.77 4755976.01 00:09:30.721 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x200 00:09:30.721 concat0 : 6.82 46.95 2.93 0.00 0.00 2048382.05 1602.78 4522553.88 00:09:30.721 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x200 length 0x200 00:09:30.721 concat0 : 6.85 44.35 2.77 0.00 
0.00 2202048.21 1638.40 4580909.41 00:09:30.721 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x100 00:09:30.721 raid1 : 6.86 60.67 3.79 0.00 0.00 1566522.51 2023.07 4347487.28 00:09:30.721 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x100 length 0x100 00:09:30.721 raid1 : 6.80 54.13 3.38 0.00 0.00 1768092.25 2108.55 4435020.58 00:09:30.721 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x0 length 0x4e 00:09:30.721 AIO0 : 6.88 64.23 4.01 0.00 0.00 877686.11 847.69 2815654.51 00:09:30.721 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:30.721 Verification LBA range: start 0x4e length 0x4e 00:09:30.721 AIO0 : 6.85 71.21 4.45 0.00 0.00 798735.47 861.94 2874010.05 00:09:30.721 =================================================================================================================== 00:09:30.721 Total : 1411.87 88.24 0.00 0.00 1478635.38 626.87 5514597.95 00:09:30.721 00:09:30.721 real 0m8.209s 00:09:30.721 user 0m15.379s 00:09:30.721 sys 0m0.440s 00:09:30.721 23:51:30 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:30.721 23:51:30 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:30.721 ************************************ 00:09:30.721 END TEST bdev_verify_big_io 00:09:30.721 ************************************ 00:09:30.721 23:51:30 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:30.721 23:51:30 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:09:30.721 23:51:30 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:30.721 23:51:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:30.721 ************************************ 00:09:30.721 START TEST bdev_write_zeroes 00:09:30.721 ************************************ 00:09:30.721 23:51:30 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:30.721 [2024-05-14 23:51:30.587311] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:09:30.721 [2024-05-14 23:51:30.587373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid377452 ] 00:09:30.721 [2024-05-14 23:51:30.716309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.721 [2024-05-14 23:51:30.814718] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.721 [2024-05-14 23:51:30.971564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:30.721 [2024-05-14 23:51:30.971629] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:30.721 [2024-05-14 23:51:30.971649] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:30.721 [2024-05-14 23:51:30.979572] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:30.721 [2024-05-14 23:51:30.979605] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:30.721 [2024-05-14 23:51:30.987579] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:30.721 [2024-05-14 23:51:30.987607] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:30.721 [2024-05-14 23:51:31.062372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:30.721 [2024-05-14 23:51:31.062428] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:30.721 [2024-05-14 23:51:31.062453] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18af010 00:09:30.721 [2024-05-14 23:51:31.062471] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:30.721 [2024-05-14 23:51:31.063933] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:30.721 [2024-05-14 23:51:31.063965] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:30.721 Running I/O for 1 seconds... 
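For readers following the trace: the write_zeroes pass reuses the same bdevperf binary as the verify pass above, changing only the workload and runtime, and the NOTICE lines about vbdev_passthru reflect the bdev.json stack building a passthru bdev (TestPT) on top of Malloc3 at startup. The flags carry their usual bdevperf meanings (-q queue depth, -o I/O size in bytes, -w workload type, -t runtime in seconds), so an equivalent manual invocation is simply a sketch of the traced command:

  # sketch: re-run the traced write_zeroes job by hand against the same JSON bdev config
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/examples/bdevperf --json $SPDK/test/bdev/bdev.json \
      -q 128 -o 4096 -w write_zeroes -t 1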
00:09:32.095 00:09:32.095 Latency(us) 00:09:32.095 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:32.095 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc0 : 1.05 4995.05 19.51 0.00 0.00 25607.64 666.05 42854.85 00:09:32.095 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc1p0 : 1.05 4987.90 19.48 0.00 0.00 25599.50 904.68 41943.04 00:09:32.095 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc1p1 : 1.05 4980.87 19.46 0.00 0.00 25579.74 904.68 41031.23 00:09:32.095 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p0 : 1.06 4973.80 19.43 0.00 0.00 25560.22 904.68 40119.43 00:09:32.095 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p1 : 1.06 4966.82 19.40 0.00 0.00 25541.24 908.24 39207.62 00:09:32.095 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p2 : 1.06 4959.84 19.37 0.00 0.00 25523.37 904.68 38295.82 00:09:32.095 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p3 : 1.06 4952.85 19.35 0.00 0.00 25504.61 904.68 37384.01 00:09:32.095 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p4 : 1.06 4945.90 19.32 0.00 0.00 25485.99 897.56 36472.21 00:09:32.095 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p5 : 1.06 4939.00 19.29 0.00 0.00 25471.41 897.56 35788.35 00:09:32.095 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p6 : 1.06 4932.06 19.27 0.00 0.00 25447.97 904.68 34876.55 00:09:32.095 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 Malloc2p7 : 1.07 4925.18 19.24 0.00 0.00 25426.14 901.12 33964.74 00:09:32.095 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 TestPT : 1.07 4918.32 19.21 0.00 0.00 25404.60 940.30 33052.94 00:09:32.095 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 raid0 : 1.07 4910.43 19.18 0.00 0.00 25377.07 1602.78 31457.28 00:09:32.095 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 concat0 : 1.07 4902.69 19.15 0.00 0.00 25321.13 1588.54 29861.62 00:09:32.095 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 raid1 : 1.07 4893.06 19.11 0.00 0.00 25259.38 2521.71 27240.18 00:09:32.095 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:32.095 AIO0 : 1.07 4887.16 19.09 0.00 0.00 25171.54 1032.90 26214.40 00:09:32.095 =================================================================================================================== 00:09:32.095 Total : 79070.94 308.87 0.00 0.00 25455.10 666.05 42854.85 00:09:32.354 00:09:32.354 real 0m2.244s 00:09:32.354 user 0m1.859s 00:09:32.354 sys 0m0.333s 00:09:32.354 23:51:32 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:32.354 23:51:32 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:32.354 ************************************ 00:09:32.354 END TEST bdev_write_zeroes 00:09:32.354 ************************************ 00:09:32.354 23:51:32 blockdev_general 
-- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.354 23:51:32 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:09:32.354 23:51:32 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:32.354 23:51:32 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.354 ************************************ 00:09:32.354 START TEST bdev_json_nonenclosed 00:09:32.354 ************************************ 00:09:32.354 23:51:32 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.354 [2024-05-14 23:51:32.918699] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:09:32.354 [2024-05-14 23:51:32.918760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid377811 ] 00:09:32.612 [2024-05-14 23:51:33.047553] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.612 [2024-05-14 23:51:33.148929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.612 [2024-05-14 23:51:33.149004] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:09:32.612 [2024-05-14 23:51:33.149030] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:32.612 [2024-05-14 23:51:33.149048] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:32.869 00:09:32.869 real 0m0.412s 00:09:32.869 user 0m0.248s 00:09:32.869 sys 0m0.161s 00:09:32.869 23:51:33 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:32.869 23:51:33 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:32.869 ************************************ 00:09:32.869 END TEST bdev_json_nonenclosed 00:09:32.870 ************************************ 00:09:32.870 23:51:33 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.870 23:51:33 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:09:32.870 23:51:33 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:32.870 23:51:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.870 ************************************ 00:09:32.870 START TEST bdev_json_nonarray 00:09:32.870 ************************************ 00:09:32.870 23:51:33 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:32.870 [2024-05-14 23:51:33.417157] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:09:32.870 [2024-05-14 23:51:33.417206] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid377918 ] 00:09:33.127 [2024-05-14 23:51:33.527458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.127 [2024-05-14 23:51:33.625455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.127 [2024-05-14 23:51:33.625544] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:09:33.127 [2024-05-14 23:51:33.625572] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:33.127 [2024-05-14 23:51:33.625588] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:33.384 00:09:33.384 real 0m0.387s 00:09:33.384 user 0m0.239s 00:09:33.384 sys 0m0.145s 00:09:33.384 23:51:33 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:33.384 23:51:33 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:33.384 ************************************ 00:09:33.384 END TEST bdev_json_nonarray 00:09:33.384 ************************************ 00:09:33.384 23:51:33 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:33.384 23:51:33 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:33.384 23:51:33 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:33.384 23:51:33 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:33.384 23:51:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:33.384 ************************************ 00:09:33.384 START TEST bdev_qos 00:09:33.384 ************************************ 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1121 -- # qos_test_suite '' 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=378024 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 378024' 00:09:33.384 Process qos testing pid: 378024 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 378024 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@827 -- # '[' -z 378024 ']' 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:33.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
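The two negative JSON tests above (bdev_json_nonenclosed and bdev_json_nonarray) deliberately hand bdevperf configs that violate the two structural rules reported in the errors: the configuration must be enclosed in {} and 'subsystems' must be an array. As a hedged illustration (only the outer shape is implied by those error messages; the empty bdev subsystem entry is illustrative and the /tmp path is hypothetical), a config that satisfies both checks looks roughly like this:

  # sketch: the outer shape that nonenclosed.json and nonarray.json each violate
  cat > /tmp/minimal_bdev.json <<'EOF'
  {
    "subsystems": [
      { "subsystem": "bdev", "config": [] }
    ]
  }
  EOF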
00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:33.384 23:51:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.384 [2024-05-14 23:51:33.909602] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:09:33.384 [2024-05-14 23:51:33.909667] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid378024 ] 00:09:33.642 [2024-05-14 23:51:34.031304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.642 [2024-05-14 23:51:34.137076] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:34.259 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:34.259 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # return 0 00:09:34.259 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:34.259 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.259 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 Malloc_0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 [ 00:09:34.516 { 00:09:34.516 "name": "Malloc_0", 00:09:34.516 "aliases": [ 00:09:34.516 "a3b3baac-8108-4c6e-9a2e-c50cf315e158" 00:09:34.516 ], 00:09:34.516 "product_name": "Malloc disk", 00:09:34.516 "block_size": 512, 00:09:34.516 "num_blocks": 262144, 00:09:34.516 "uuid": "a3b3baac-8108-4c6e-9a2e-c50cf315e158", 00:09:34.516 "assigned_rate_limits": { 00:09:34.516 "rw_ios_per_sec": 0, 00:09:34.516 "rw_mbytes_per_sec": 0, 00:09:34.516 "r_mbytes_per_sec": 0, 00:09:34.516 "w_mbytes_per_sec": 0 00:09:34.516 }, 00:09:34.516 "claimed": false, 00:09:34.516 "zoned": false, 00:09:34.516 "supported_io_types": { 00:09:34.516 "read": true, 00:09:34.516 "write": true, 00:09:34.516 "unmap": true, 00:09:34.516 "write_zeroes": true, 00:09:34.516 "flush": true, 00:09:34.516 
"reset": true, 00:09:34.516 "compare": false, 00:09:34.516 "compare_and_write": false, 00:09:34.516 "abort": true, 00:09:34.516 "nvme_admin": false, 00:09:34.516 "nvme_io": false 00:09:34.516 }, 00:09:34.516 "memory_domains": [ 00:09:34.516 { 00:09:34.516 "dma_device_id": "system", 00:09:34.516 "dma_device_type": 1 00:09:34.516 }, 00:09:34.516 { 00:09:34.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:34.516 "dma_device_type": 2 00:09:34.516 } 00:09:34.516 ], 00:09:34.516 "driver_specific": {} 00:09:34.516 } 00:09:34.516 ] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 Null_1 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Null_1 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:34.516 [ 00:09:34.516 { 00:09:34.516 "name": "Null_1", 00:09:34.516 "aliases": [ 00:09:34.516 "f00c4797-d1b8-4818-a55c-31c46e93579f" 00:09:34.516 ], 00:09:34.516 "product_name": "Null disk", 00:09:34.516 "block_size": 512, 00:09:34.516 "num_blocks": 262144, 00:09:34.516 "uuid": "f00c4797-d1b8-4818-a55c-31c46e93579f", 00:09:34.516 "assigned_rate_limits": { 00:09:34.516 "rw_ios_per_sec": 0, 00:09:34.516 "rw_mbytes_per_sec": 0, 00:09:34.516 "r_mbytes_per_sec": 0, 00:09:34.516 "w_mbytes_per_sec": 0 00:09:34.516 }, 00:09:34.516 "claimed": false, 00:09:34.516 "zoned": false, 00:09:34.516 "supported_io_types": { 00:09:34.516 "read": true, 00:09:34.516 "write": true, 00:09:34.516 "unmap": false, 00:09:34.516 "write_zeroes": true, 00:09:34.516 "flush": false, 00:09:34.516 "reset": true, 00:09:34.516 "compare": false, 00:09:34.516 "compare_and_write": false, 00:09:34.516 "abort": true, 00:09:34.516 "nvme_admin": false, 00:09:34.516 "nvme_io": false 00:09:34.516 }, 00:09:34.516 "driver_specific": {} 00:09:34.516 } 00:09:34.516 ] 00:09:34.516 23:51:34 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:34.516 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:34.517 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:34.517 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:34.517 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:34.517 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:34.517 23:51:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:34.517 Running I/O for 60 seconds... 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 60901.76 243607.06 0.00 0.00 244736.00 0.00 0.00 ' 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=60901.76 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 60901 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=60901 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:39.777 23:51:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.777 ************************************ 00:09:39.777 START TEST bdev_qos_iops 00:09:39.777 ************************************ 00:09:39.777 23:51:40 
blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1121 -- # run_qos_test 15000 IOPS Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:39.777 23:51:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 15000.09 60000.35 0.00 0.00 61140.00 0.00 0.00 ' 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=15000.09 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 15000 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=15000 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15000 -lt 13500 ']' 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15000 -gt 16500 ']' 00:09:45.120 00:09:45.120 real 0m5.253s 00:09:45.120 user 0m0.101s 00:09:45.120 sys 0m0.057s 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:45.120 23:51:45 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:45.120 ************************************ 00:09:45.120 END TEST bdev_qos_iops 00:09:45.120 ************************************ 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:45.120 23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:45.120 
23:51:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 19864.87 79459.47 0.00 0.00 80896.00 0.00 0.00 ' 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=80896.00 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 80896 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=80896 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=7 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 7 -lt 2 ']' 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 7 Null_1 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 7 BANDWIDTH Null_1 00:09:50.388 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:50.389 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:50.389 23:51:50 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.389 ************************************ 00:09:50.389 START TEST bdev_qos_bw 00:09:50.389 ************************************ 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1121 -- # run_qos_test 7 BANDWIDTH Null_1 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=7 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:50.389 23:51:50 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1791.76 7167.04 0.00 0.00 7288.00 0.00 0.00 ' 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.652 23:51:56 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=7288.00 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 7288 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=7288 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=7168 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=6451 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=7884 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7288 -lt 6451 ']' 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7288 -gt 7884 ']' 00:09:55.652 00:09:55.652 real 0m5.251s 00:09:55.652 user 0m0.107s 00:09:55.652 sys 0m0.055s 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:55.652 ************************************ 00:09:55.652 END TEST bdev_qos_bw 00:09:55.652 ************************************ 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:55.652 23:51:56 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.652 ************************************ 00:09:55.652 START TEST bdev_qos_ro_bw 00:09:55.652 ************************************ 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1121 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:55.652 23:51:56 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:55.652 23:51:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.61 2046.45 0.00 0.00 2060.00 0.00 0.00 ' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:00.917 00:10:00.917 real 0m5.186s 00:10:00.917 user 0m0.118s 00:10:00.917 sys 0m0.043s 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:00.917 23:52:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:00.917 ************************************ 00:10:00.917 END TEST bdev_qos_ro_bw 00:10:00.917 ************************************ 00:10:00.917 23:52:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:00.917 23:52:01 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.917 23:52:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.483 23:52:01 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.483 23:52:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:01.483 23:52:01 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:01.483 23:52:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.741 00:10:01.741 Latency(us) 00:10:01.741 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.741 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:01.741 Malloc_0 : 26.82 20553.60 80.29 0.00 0.00 12339.08 2065.81 503316.48 00:10:01.741 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:01.741 Null_1 : 26.99 20004.47 78.14 0.00 0.00 12760.72 851.26 165948.55 00:10:01.741 =================================================================================================================== 00:10:01.741 Total : 40558.07 158.43 0.00 0.00 12547.70 851.26 503316.48 00:10:01.741 0 00:10:01.741 23:52:02 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 378024 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@946 -- # '[' -z 378024 ']' 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # kill -0 378024 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # uname 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 378024 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@964 -- # echo 'killing process with pid 378024' 00:10:01.741 killing process with pid 378024 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@965 -- # kill 378024 00:10:01.741 Received shutdown signal, test time was about 27.055163 seconds 00:10:01.741 00:10:01.741 Latency(us) 00:10:01.741 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.741 =================================================================================================================== 00:10:01.741 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:01.741 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@970 -- # wait 378024 00:10:02.000 23:52:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:02.000 00:10:02.000 real 0m28.576s 00:10:02.000 user 0m29.371s 00:10:02.000 sys 0m0.842s 00:10:02.000 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:02.000 23:52:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:02.000 ************************************ 00:10:02.000 END TEST bdev_qos 00:10:02.000 ************************************ 00:10:02.000 23:52:02 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:02.000 23:52:02 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:10:02.000 23:52:02 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:02.000 23:52:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:02.000 ************************************ 00:10:02.000 START TEST bdev_qd_sampling 00:10:02.000 ************************************ 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1121 -- # qd_sampling_test_suite '' 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=381818 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 381818' 00:10:02.000 Process bdev QD sampling period testing pid: 381818 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 
'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 381818 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@827 -- # '[' -z 381818 ']' 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:02.000 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:02.000 23:52:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.000 [2024-05-14 23:52:02.577770] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:02.000 [2024-05-14 23:52:02.577834] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid381818 ] 00:10:02.258 [2024-05-14 23:52:02.703966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:02.258 [2024-05-14 23:52:02.807551] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:10:02.258 [2024-05-14 23:52:02.807556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # return 0 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:03.191 Malloc_QD 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_QD 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local i 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:03.191 23:52:03 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:03.191 [ 00:10:03.191 { 00:10:03.191 "name": "Malloc_QD", 00:10:03.191 "aliases": [ 00:10:03.191 "787620de-878c-4759-8a60-c5933b5ee22e" 00:10:03.191 ], 00:10:03.191 "product_name": "Malloc disk", 00:10:03.191 "block_size": 512, 00:10:03.191 "num_blocks": 262144, 00:10:03.191 "uuid": "787620de-878c-4759-8a60-c5933b5ee22e", 00:10:03.191 "assigned_rate_limits": { 00:10:03.191 "rw_ios_per_sec": 0, 00:10:03.191 "rw_mbytes_per_sec": 0, 00:10:03.191 "r_mbytes_per_sec": 0, 00:10:03.191 "w_mbytes_per_sec": 0 00:10:03.191 }, 00:10:03.191 "claimed": false, 00:10:03.191 "zoned": false, 00:10:03.191 "supported_io_types": { 00:10:03.191 "read": true, 00:10:03.191 "write": true, 00:10:03.191 "unmap": true, 00:10:03.191 "write_zeroes": true, 00:10:03.191 "flush": true, 00:10:03.191 "reset": true, 00:10:03.191 "compare": false, 00:10:03.191 "compare_and_write": false, 00:10:03.191 "abort": true, 00:10:03.191 "nvme_admin": false, 00:10:03.191 "nvme_io": false 00:10:03.191 }, 00:10:03.191 "memory_domains": [ 00:10:03.191 { 00:10:03.191 "dma_device_id": "system", 00:10:03.191 "dma_device_type": 1 00:10:03.191 }, 00:10:03.191 { 00:10:03.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:03.191 "dma_device_type": 2 00:10:03.191 } 00:10:03.191 ], 00:10:03.191 "driver_specific": {} 00:10:03.191 } 00:10:03.191 ] 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@903 -- # return 0 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:03.191 23:52:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:03.191 Running I/O for 5 seconds... 
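Before the queue-depth sampling numbers arrive, it is worth spelling out the pass/fail rule that the three QoS sub-tests above (bdev_qos_iops, bdev_qos_bw, bdev_qos_ro_bw) applied: run_qos_test measures the achieved rate with iostat.py and accepts it only if it stays within roughly ±10% of the configured limit. The bounds printed in the trace (13500/16500 for the 15000 IOPS limit, 6451/7884 for the 7168 limit, 1843/2252 for the 2048 limit) are consistent with integer shell arithmetic of the following form (a sketch, not the verbatim test code):

  # sketch of the tolerance window implied by the traced lower/upper bounds
  qos_limit=15000                          # or 7168 / 2048 for the bandwidth cases
  lower_limit=$(( qos_limit * 9 / 10 ))    # 13500
  upper_limit=$(( qos_limit * 11 / 10 ))   # 16500
  # the test fails if the measured qos_result falls outside [lower_limit, upper_limit]
  [ "$qos_result" -ge "$lower_limit" ] && [ "$qos_result" -le "$upper_limit" ]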
00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:05.090 "tick_rate": 2300000000, 00:10:05.090 "ticks": 7045913921239860, 00:10:05.090 "bdevs": [ 00:10:05.090 { 00:10:05.090 "name": "Malloc_QD", 00:10:05.090 "bytes_read": 768651776, 00:10:05.090 "num_read_ops": 187652, 00:10:05.090 "bytes_written": 0, 00:10:05.090 "num_write_ops": 0, 00:10:05.090 "bytes_unmapped": 0, 00:10:05.090 "num_unmap_ops": 0, 00:10:05.090 "bytes_copied": 0, 00:10:05.090 "num_copy_ops": 0, 00:10:05.090 "read_latency_ticks": 2243504704868, 00:10:05.090 "max_read_latency_ticks": 14608700, 00:10:05.090 "min_read_latency_ticks": 254612, 00:10:05.090 "write_latency_ticks": 0, 00:10:05.090 "max_write_latency_ticks": 0, 00:10:05.090 "min_write_latency_ticks": 0, 00:10:05.090 "unmap_latency_ticks": 0, 00:10:05.090 "max_unmap_latency_ticks": 0, 00:10:05.090 "min_unmap_latency_ticks": 0, 00:10:05.090 "copy_latency_ticks": 0, 00:10:05.090 "max_copy_latency_ticks": 0, 00:10:05.090 "min_copy_latency_ticks": 0, 00:10:05.090 "io_error": {}, 00:10:05.090 "queue_depth_polling_period": 10, 00:10:05.090 "queue_depth": 512, 00:10:05.090 "io_time": 30, 00:10:05.090 "weighted_io_time": 15360 00:10:05.090 } 00:10:05.090 ] 00:10:05.090 }' 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.090 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:05.348 00:10:05.348 Latency(us) 00:10:05.348 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:05.348 Job: Malloc_QD (Core Mask 0x1, workload: randread, 
depth: 256, IO size: 4096) 00:10:05.348 Malloc_QD : 1.99 48714.21 190.29 0.00 0.00 5241.31 1631.28 5641.79 00:10:05.348 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:05.348 Malloc_QD : 1.99 49605.11 193.77 0.00 0.00 5147.87 1467.44 6354.14 00:10:05.348 =================================================================================================================== 00:10:05.348 Total : 98319.32 384.06 0.00 0.00 5194.16 1467.44 6354.14 00:10:05.348 0 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 381818 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@946 -- # '[' -z 381818 ']' 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # kill -0 381818 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # uname 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 381818 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@964 -- # echo 'killing process with pid 381818' 00:10:05.348 killing process with pid 381818 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@965 -- # kill 381818 00:10:05.348 Received shutdown signal, test time was about 2.063470 seconds 00:10:05.348 00:10:05.348 Latency(us) 00:10:05.348 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:05.348 =================================================================================================================== 00:10:05.348 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:05.348 23:52:05 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@970 -- # wait 381818 00:10:05.606 23:52:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:05.606 00:10:05.606 real 0m3.485s 00:10:05.606 user 0m6.799s 00:10:05.606 sys 0m0.446s 00:10:05.606 23:52:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:05.606 23:52:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:05.606 ************************************ 00:10:05.606 END TEST bdev_qd_sampling 00:10:05.606 ************************************ 00:10:05.606 23:52:06 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:05.606 23:52:06 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:10:05.606 23:52:06 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:05.606 23:52:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.606 ************************************ 00:10:05.606 START TEST bdev_error 00:10:05.606 ************************************ 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@1121 -- # error_test_suite '' 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 
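The queue-depth sampling check that just completed boils down to: enable sampling on the bdev, drive I/O, then confirm that bdev_get_iostat reports the configured polling period back (a null or mismatching value fails the test). A condensed sketch using the same RPCs seen in the trace, with rpc.py shown in place of the suite's rpc_cmd wrapper:

  # sketch: enable QD sampling on Malloc_QD and read the period back from the iostat JSON
  scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10
  period=$(scripts/rpc.py bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period')
  [ "$period" != "null" ] && [ "$period" -eq 10 ]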
00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=382329 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 382329' 00:10:05.606 Process error testing pid: 382329 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:05.606 23:52:06 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 382329 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 382329 ']' 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:05.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:05.606 23:52:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.606 [2024-05-14 23:52:06.159781] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:05.606 [2024-05-14 23:52:06.159852] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid382329 ] 00:10:05.865 [2024-05-14 23:52:06.283368] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.865 [2024-05-14 23:52:06.381290] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:10:06.799 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 Dev_1 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:06.799 
23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 [ 00:10:06.799 { 00:10:06.799 "name": "Dev_1", 00:10:06.799 "aliases": [ 00:10:06.799 "ea5bf05c-1f2c-4cde-bfc5-9721c812b052" 00:10:06.799 ], 00:10:06.799 "product_name": "Malloc disk", 00:10:06.799 "block_size": 512, 00:10:06.799 "num_blocks": 262144, 00:10:06.799 "uuid": "ea5bf05c-1f2c-4cde-bfc5-9721c812b052", 00:10:06.799 "assigned_rate_limits": { 00:10:06.799 "rw_ios_per_sec": 0, 00:10:06.799 "rw_mbytes_per_sec": 0, 00:10:06.799 "r_mbytes_per_sec": 0, 00:10:06.799 "w_mbytes_per_sec": 0 00:10:06.799 }, 00:10:06.799 "claimed": false, 00:10:06.799 "zoned": false, 00:10:06.799 "supported_io_types": { 00:10:06.799 "read": true, 00:10:06.799 "write": true, 00:10:06.799 "unmap": true, 00:10:06.799 "write_zeroes": true, 00:10:06.799 "flush": true, 00:10:06.799 "reset": true, 00:10:06.799 "compare": false, 00:10:06.799 "compare_and_write": false, 00:10:06.799 "abort": true, 00:10:06.799 "nvme_admin": false, 00:10:06.799 "nvme_io": false 00:10:06.799 }, 00:10:06.799 "memory_domains": [ 00:10:06.799 { 00:10:06.799 "dma_device_id": "system", 00:10:06.799 "dma_device_type": 1 00:10:06.799 }, 00:10:06.799 { 00:10:06.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.799 "dma_device_type": 2 00:10:06.799 } 00:10:06.799 ], 00:10:06.799 "driver_specific": {} 00:10:06.799 } 00:10:06.799 ] 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:10:06.799 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 true 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.800 Dev_2 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- 
common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.800 [ 00:10:06.800 { 00:10:06.800 "name": "Dev_2", 00:10:06.800 "aliases": [ 00:10:06.800 "7692aed3-1137-4e7f-b003-e3ec522c2daf" 00:10:06.800 ], 00:10:06.800 "product_name": "Malloc disk", 00:10:06.800 "block_size": 512, 00:10:06.800 "num_blocks": 262144, 00:10:06.800 "uuid": "7692aed3-1137-4e7f-b003-e3ec522c2daf", 00:10:06.800 "assigned_rate_limits": { 00:10:06.800 "rw_ios_per_sec": 0, 00:10:06.800 "rw_mbytes_per_sec": 0, 00:10:06.800 "r_mbytes_per_sec": 0, 00:10:06.800 "w_mbytes_per_sec": 0 00:10:06.800 }, 00:10:06.800 "claimed": false, 00:10:06.800 "zoned": false, 00:10:06.800 "supported_io_types": { 00:10:06.800 "read": true, 00:10:06.800 "write": true, 00:10:06.800 "unmap": true, 00:10:06.800 "write_zeroes": true, 00:10:06.800 "flush": true, 00:10:06.800 "reset": true, 00:10:06.800 "compare": false, 00:10:06.800 "compare_and_write": false, 00:10:06.800 "abort": true, 00:10:06.800 "nvme_admin": false, 00:10:06.800 "nvme_io": false 00:10:06.800 }, 00:10:06.800 "memory_domains": [ 00:10:06.800 { 00:10:06.800 "dma_device_id": "system", 00:10:06.800 "dma_device_type": 1 00:10:06.800 }, 00:10:06.800 { 00:10:06.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.800 "dma_device_type": 2 00:10:06.800 } 00:10:06.800 ], 00:10:06.800 "driver_specific": {} 00:10:06.800 } 00:10:06.800 ] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:10:06.800 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.800 23:52:07 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.800 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:06.800 23:52:07 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:06.800 Running I/O for 5 seconds... 00:10:07.735 23:52:08 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 382329 00:10:07.735 23:52:08 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 382329' 00:10:07.735 Process is existed as continue on error is set. 
Pid: 382329 00:10:07.735 23:52:08 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:07.735 23:52:08 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:07.735 23:52:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:07.735 23:52:08 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:07.993 Timeout while waiting for response: 00:10:07.993 00:10:07.993 00:10:12.232 00:10:12.232 Latency(us) 00:10:12.232 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:12.232 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:12.232 EE_Dev_1 : 0.90 36986.24 144.48 5.57 0.00 428.74 132.67 701.66 00:10:12.232 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:12.232 Dev_2 : 5.00 79719.90 311.41 0.00 0.00 196.92 69.90 20629.59 00:10:12.232 =================================================================================================================== 00:10:12.232 Total : 116706.13 455.88 5.57 0.00 214.74 69.90 20629.59 00:10:12.798 23:52:13 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 382329 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@946 -- # '[' -z 382329 ']' 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # kill -0 382329 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # uname 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 382329 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@964 -- # echo 'killing process with pid 382329' 00:10:12.798 killing process with pid 382329 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@965 -- # kill 382329 00:10:12.798 Received shutdown signal, test time was about 5.000000 seconds 00:10:12.798 00:10:12.798 Latency(us) 00:10:12.798 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:12.798 =================================================================================================================== 00:10:12.798 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:12.798 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@970 -- # wait 382329 00:10:13.058 23:52:13 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=383266 00:10:13.058 23:52:13 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 383266' 00:10:13.058 Process error testing pid: 383266 00:10:13.058 23:52:13 
blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:13.058 23:52:13 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 383266 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 383266 ']' 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:13.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:13.058 23:52:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.317 [2024-05-14 23:52:13.684965] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:13.317 [2024-05-14 23:52:13.685044] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid383266 ] 00:10:13.317 [2024-05-14 23:52:13.807748] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.576 [2024-05-14 23:52:13.912455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:10:14.143 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.143 Dev_1 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.143 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:14.143 23:52:14 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.143 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.143 [ 00:10:14.143 { 00:10:14.143 "name": "Dev_1", 00:10:14.143 "aliases": [ 00:10:14.143 "84df1133-4746-431f-8f9e-c9533a8031b9" 00:10:14.143 ], 00:10:14.143 "product_name": "Malloc disk", 00:10:14.143 "block_size": 512, 00:10:14.143 "num_blocks": 262144, 00:10:14.144 "uuid": "84df1133-4746-431f-8f9e-c9533a8031b9", 00:10:14.144 "assigned_rate_limits": { 00:10:14.144 "rw_ios_per_sec": 0, 00:10:14.144 "rw_mbytes_per_sec": 0, 00:10:14.144 "r_mbytes_per_sec": 0, 00:10:14.144 "w_mbytes_per_sec": 0 00:10:14.144 }, 00:10:14.144 "claimed": false, 00:10:14.144 "zoned": false, 00:10:14.144 "supported_io_types": { 00:10:14.144 "read": true, 00:10:14.144 "write": true, 00:10:14.144 "unmap": true, 00:10:14.144 "write_zeroes": true, 00:10:14.144 "flush": true, 00:10:14.144 "reset": true, 00:10:14.144 "compare": false, 00:10:14.144 "compare_and_write": false, 00:10:14.144 "abort": true, 00:10:14.144 "nvme_admin": false, 00:10:14.144 "nvme_io": false 00:10:14.144 }, 00:10:14.144 "memory_domains": [ 00:10:14.144 { 00:10:14.144 "dma_device_id": "system", 00:10:14.144 "dma_device_type": 1 00:10:14.144 }, 00:10:14.144 { 00:10:14.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.144 "dma_device_type": 2 00:10:14.144 } 00:10:14.144 ], 00:10:14.144 "driver_specific": {} 00:10:14.144 } 00:10:14.144 ] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:10:14.144 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.144 true 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.144 Dev_2 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 
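The JSON blocks in this part of the trace are bdev_get_bdevs descriptors printed while waitforbdev polls for the new devices. As a quick sanity check against the same RPC, one could confirm the malloc geometry matches the create call; the jq expression here is illustrative and not part of the suite.

  # Dev_1 was created as 128 MiB with 512-byte blocks, so expect: Dev_1 512 262144
  scripts/rpc.py bdev_get_bdevs -b Dev_1 -t 2000 | jq -r '.[0] | "\(.name) \(.block_size) \(.num_blocks)"'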
00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.144 [ 00:10:14.144 { 00:10:14.144 "name": "Dev_2", 00:10:14.144 "aliases": [ 00:10:14.144 "e0650387-c728-437e-a845-6c50511954c9" 00:10:14.144 ], 00:10:14.144 "product_name": "Malloc disk", 00:10:14.144 "block_size": 512, 00:10:14.144 "num_blocks": 262144, 00:10:14.144 "uuid": "e0650387-c728-437e-a845-6c50511954c9", 00:10:14.144 "assigned_rate_limits": { 00:10:14.144 "rw_ios_per_sec": 0, 00:10:14.144 "rw_mbytes_per_sec": 0, 00:10:14.144 "r_mbytes_per_sec": 0, 00:10:14.144 "w_mbytes_per_sec": 0 00:10:14.144 }, 00:10:14.144 "claimed": false, 00:10:14.144 "zoned": false, 00:10:14.144 "supported_io_types": { 00:10:14.144 "read": true, 00:10:14.144 "write": true, 00:10:14.144 "unmap": true, 00:10:14.144 "write_zeroes": true, 00:10:14.144 "flush": true, 00:10:14.144 "reset": true, 00:10:14.144 "compare": false, 00:10:14.144 "compare_and_write": false, 00:10:14.144 "abort": true, 00:10:14.144 "nvme_admin": false, 00:10:14.144 "nvme_io": false 00:10:14.144 }, 00:10:14.144 "memory_domains": [ 00:10:14.144 { 00:10:14.144 "dma_device_id": "system", 00:10:14.144 "dma_device_type": 1 00:10:14.144 }, 00:10:14.144 { 00:10:14.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.144 "dma_device_type": 2 00:10:14.144 } 00:10:14.144 ], 00:10:14.144 "driver_specific": {} 00:10:14.144 } 00:10:14.144 ] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:10:14.144 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:14.144 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:14.404 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:14.404 23:52:14 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 383266 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 383266 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:14.404 23:52:14 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 383266 00:10:14.404 Running I/O for 5 seconds... 
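Unlike the first bdevperf instance (pid 382329), which was launched with -f so it continues after injected failures, this second instance (pid 383266) omits -f; with EE_Dev_1 primed to fail I/O, the perform_tests call is expected to bring the app down, and the suite wraps it in NOT to assert the non-zero exit. The distinction, with the flags copied from the trace and the JSON config argument left empty as in the log:

  # run 1: -f (continue on error) keeps bdevperf alive so the failure counters can be read
  build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f ''
  # run 2: no -f, so the injected EE_Dev_1 failures abort the run
  build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 ''
  # in both runs I/O only starts once perform_tests is issued over the RPC socket
  examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests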
00:10:14.404 task offset: 261936 on job bdev=EE_Dev_1 fails 00:10:14.404 00:10:14.404 Latency(us) 00:10:14.404 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:14.404 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:14.404 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:14.404 EE_Dev_1 : 0.00 29810.30 116.45 6775.07 0.00 361.70 132.67 644.67 00:10:14.404 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:14.404 Dev_2 : 0.00 18327.61 71.59 0.00 0.00 648.40 128.22 1203.87 00:10:14.404 =================================================================================================================== 00:10:14.404 Total : 48137.90 188.04 6775.07 0.00 517.20 128.22 1203.87 00:10:14.404 [2024-05-14 23:52:14.813190] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:14.404 request: 00:10:14.404 { 00:10:14.404 "method": "perform_tests", 00:10:14.404 "req_id": 1 00:10:14.404 } 00:10:14.404 Got JSON-RPC error response 00:10:14.404 response: 00:10:14.404 { 00:10:14.404 "code": -32603, 00:10:14.404 "message": "bdevperf failed with error Operation not permitted" 00:10:14.404 } 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:14.663 00:10:14.663 real 0m9.038s 00:10:14.663 user 0m9.348s 00:10:14.663 sys 0m0.876s 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:14.663 23:52:15 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.663 ************************************ 00:10:14.663 END TEST bdev_error 00:10:14.663 ************************************ 00:10:14.663 23:52:15 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:14.663 23:52:15 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:10:14.663 23:52:15 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:14.663 23:52:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:14.663 ************************************ 00:10:14.663 START TEST bdev_stat 00:10:14.663 ************************************ 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@1121 -- # stat_test_suite '' 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=383466 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 383466' 00:10:14.663 Process Bdev IO statistics testing pid: 383466 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM 
EXIT 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 383466 00:10:14.663 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@827 -- # '[' -z 383466 ']' 00:10:14.664 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:14.664 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:14.664 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:14.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:14.664 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:14.664 23:52:15 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:14.922 [2024-05-14 23:52:15.289073] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:14.922 [2024-05-14 23:52:15.289137] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid383466 ] 00:10:14.922 [2024-05-14 23:52:15.417425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:15.182 [2024-05-14 23:52:15.528060] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:10:15.182 [2024-05-14 23:52:15.528065] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:15.750 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:15.750 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # return 0 00:10:15.750 23:52:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:15.750 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.750 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.751 Malloc_STAT 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_STAT 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local i 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.751 
23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.751 [ 00:10:15.751 { 00:10:15.751 "name": "Malloc_STAT", 00:10:15.751 "aliases": [ 00:10:15.751 "9c03b0dc-91f9-4576-aa40-58dee63452f9" 00:10:15.751 ], 00:10:15.751 "product_name": "Malloc disk", 00:10:15.751 "block_size": 512, 00:10:15.751 "num_blocks": 262144, 00:10:15.751 "uuid": "9c03b0dc-91f9-4576-aa40-58dee63452f9", 00:10:15.751 "assigned_rate_limits": { 00:10:15.751 "rw_ios_per_sec": 0, 00:10:15.751 "rw_mbytes_per_sec": 0, 00:10:15.751 "r_mbytes_per_sec": 0, 00:10:15.751 "w_mbytes_per_sec": 0 00:10:15.751 }, 00:10:15.751 "claimed": false, 00:10:15.751 "zoned": false, 00:10:15.751 "supported_io_types": { 00:10:15.751 "read": true, 00:10:15.751 "write": true, 00:10:15.751 "unmap": true, 00:10:15.751 "write_zeroes": true, 00:10:15.751 "flush": true, 00:10:15.751 "reset": true, 00:10:15.751 "compare": false, 00:10:15.751 "compare_and_write": false, 00:10:15.751 "abort": true, 00:10:15.751 "nvme_admin": false, 00:10:15.751 "nvme_io": false 00:10:15.751 }, 00:10:15.751 "memory_domains": [ 00:10:15.751 { 00:10:15.751 "dma_device_id": "system", 00:10:15.751 "dma_device_type": 1 00:10:15.751 }, 00:10:15.751 { 00:10:15.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.751 "dma_device_type": 2 00:10:15.751 } 00:10:15.751 ], 00:10:15.751 "driver_specific": {} 00:10:15.751 } 00:10:15.751 ] 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # return 0 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:15.751 23:52:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:16.010 Running I/O for 10 seconds... 
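The bdev_stat trace that follows takes a whole-device iostat sample, then a per-channel sample, then a second whole-device sample, and finally asserts that the summed per-channel read count lies between the two totals (here 185604 <= 192256 <= 204292). The same three queries, reduced to a sketch against the bdevperf RPC socket with illustrative jq filters:

  # total reads on Malloc_STAT at the first sample point
  scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops'
  # per-channel counters (-c): one entry per I/O channel, i.e. per reactor running the job
  scripts/rpc.py bdev_get_iostat -b Malloc_STAT -c | jq -r '[.channels[].num_read_ops] | add'
  # second whole-device sample; the per-channel sum above must fall between the two totals
  scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops'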
00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:17.917 "tick_rate": 2300000000, 00:10:17.917 "ticks": 7045943069612720, 00:10:17.917 "bdevs": [ 00:10:17.917 { 00:10:17.917 "name": "Malloc_STAT", 00:10:17.917 "bytes_read": 760263168, 00:10:17.917 "num_read_ops": 185604, 00:10:17.917 "bytes_written": 0, 00:10:17.917 "num_write_ops": 0, 00:10:17.917 "bytes_unmapped": 0, 00:10:17.917 "num_unmap_ops": 0, 00:10:17.917 "bytes_copied": 0, 00:10:17.917 "num_copy_ops": 0, 00:10:17.917 "read_latency_ticks": 2232173529398, 00:10:17.917 "max_read_latency_ticks": 14458018, 00:10:17.917 "min_read_latency_ticks": 269638, 00:10:17.917 "write_latency_ticks": 0, 00:10:17.917 "max_write_latency_ticks": 0, 00:10:17.917 "min_write_latency_ticks": 0, 00:10:17.917 "unmap_latency_ticks": 0, 00:10:17.917 "max_unmap_latency_ticks": 0, 00:10:17.917 "min_unmap_latency_ticks": 0, 00:10:17.917 "copy_latency_ticks": 0, 00:10:17.917 "max_copy_latency_ticks": 0, 00:10:17.917 "min_copy_latency_ticks": 0, 00:10:17.917 "io_error": {} 00:10:17.917 } 00:10:17.917 ] 00:10:17.917 }' 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=185604 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:17.917 "tick_rate": 2300000000, 00:10:17.917 "ticks": 7045943228726012, 00:10:17.917 "name": "Malloc_STAT", 00:10:17.917 "channels": [ 00:10:17.917 { 00:10:17.917 "thread_id": 2, 00:10:17.917 "bytes_read": 387973120, 00:10:17.917 "num_read_ops": 94720, 00:10:17.917 "bytes_written": 0, 00:10:17.917 "num_write_ops": 0, 00:10:17.917 "bytes_unmapped": 0, 00:10:17.917 "num_unmap_ops": 0, 
00:10:17.917 "bytes_copied": 0, 00:10:17.917 "num_copy_ops": 0, 00:10:17.917 "read_latency_ticks": 1156065321102, 00:10:17.917 "max_read_latency_ticks": 13254262, 00:10:17.917 "min_read_latency_ticks": 8099964, 00:10:17.917 "write_latency_ticks": 0, 00:10:17.917 "max_write_latency_ticks": 0, 00:10:17.917 "min_write_latency_ticks": 0, 00:10:17.917 "unmap_latency_ticks": 0, 00:10:17.917 "max_unmap_latency_ticks": 0, 00:10:17.917 "min_unmap_latency_ticks": 0, 00:10:17.917 "copy_latency_ticks": 0, 00:10:17.917 "max_copy_latency_ticks": 0, 00:10:17.917 "min_copy_latency_ticks": 0 00:10:17.917 }, 00:10:17.917 { 00:10:17.917 "thread_id": 3, 00:10:17.917 "bytes_read": 399507456, 00:10:17.917 "num_read_ops": 97536, 00:10:17.917 "bytes_written": 0, 00:10:17.917 "num_write_ops": 0, 00:10:17.917 "bytes_unmapped": 0, 00:10:17.917 "num_unmap_ops": 0, 00:10:17.917 "bytes_copied": 0, 00:10:17.917 "num_copy_ops": 0, 00:10:17.917 "read_latency_ticks": 1156410525584, 00:10:17.917 "max_read_latency_ticks": 14458018, 00:10:17.917 "min_read_latency_ticks": 8210638, 00:10:17.917 "write_latency_ticks": 0, 00:10:17.917 "max_write_latency_ticks": 0, 00:10:17.917 "min_write_latency_ticks": 0, 00:10:17.917 "unmap_latency_ticks": 0, 00:10:17.917 "max_unmap_latency_ticks": 0, 00:10:17.917 "min_unmap_latency_ticks": 0, 00:10:17.917 "copy_latency_ticks": 0, 00:10:17.917 "max_copy_latency_ticks": 0, 00:10:17.917 "min_copy_latency_ticks": 0 00:10:17.917 } 00:10:17.917 ] 00:10:17.917 }' 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=94720 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=94720 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=97536 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=192256 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.917 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:18.177 "tick_rate": 2300000000, 00:10:18.177 "ticks": 7045943512049992, 00:10:18.177 "bdevs": [ 00:10:18.177 { 00:10:18.177 "name": "Malloc_STAT", 00:10:18.177 "bytes_read": 836809216, 00:10:18.177 "num_read_ops": 204292, 00:10:18.177 "bytes_written": 0, 00:10:18.177 "num_write_ops": 0, 00:10:18.177 "bytes_unmapped": 0, 00:10:18.177 "num_unmap_ops": 0, 00:10:18.177 "bytes_copied": 0, 00:10:18.177 "num_copy_ops": 0, 00:10:18.177 "read_latency_ticks": 2457682292190, 00:10:18.177 "max_read_latency_ticks": 14458018, 00:10:18.177 "min_read_latency_ticks": 269638, 00:10:18.177 "write_latency_ticks": 0, 00:10:18.177 "max_write_latency_ticks": 0, 00:10:18.177 "min_write_latency_ticks": 0, 00:10:18.177 "unmap_latency_ticks": 0, 00:10:18.177 "max_unmap_latency_ticks": 0, 00:10:18.177 "min_unmap_latency_ticks": 0, 00:10:18.177 "copy_latency_ticks": 0, 00:10:18.177 "max_copy_latency_ticks": 0, 00:10:18.177 
"min_copy_latency_ticks": 0, 00:10:18.177 "io_error": {} 00:10:18.177 } 00:10:18.177 ] 00:10:18.177 }' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=204292 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192256 -lt 185604 ']' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192256 -gt 204292 ']' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:18.177 00:10:18.177 Latency(us) 00:10:18.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:18.177 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:18.177 Malloc_STAT : 2.17 48136.78 188.03 0.00 0.00 5305.16 1360.58 5784.26 00:10:18.177 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:18.177 Malloc_STAT : 2.17 49623.55 193.84 0.00 0.00 5146.65 975.92 6297.15 00:10:18.177 =================================================================================================================== 00:10:18.177 Total : 97760.33 381.88 0.00 0.00 5224.66 975.92 6297.15 00:10:18.177 0 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 383466 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@946 -- # '[' -z 383466 ']' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # kill -0 383466 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # uname 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 383466 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 383466' 00:10:18.177 killing process with pid 383466 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@965 -- # kill 383466 00:10:18.177 Received shutdown signal, test time was about 2.246675 seconds 00:10:18.177 00:10:18.177 Latency(us) 00:10:18.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:18.177 =================================================================================================================== 00:10:18.177 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:18.177 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@970 -- # wait 383466 00:10:18.437 23:52:18 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:18.437 00:10:18.437 real 0m3.656s 00:10:18.437 user 0m7.311s 00:10:18.437 sys 0m0.455s 00:10:18.437 23:52:18 blockdev_general.bdev_stat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:18.437 23:52:18 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:18.437 ************************************ 00:10:18.437 END TEST bdev_stat 00:10:18.437 ************************************ 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:18.437 23:52:18 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:18.437 00:10:18.437 real 1m56.401s 00:10:18.437 user 7m10.396s 00:10:18.437 sys 0m22.912s 00:10:18.437 23:52:18 blockdev_general -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:18.437 23:52:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:18.437 ************************************ 00:10:18.437 END TEST blockdev_general 00:10:18.437 ************************************ 00:10:18.437 23:52:18 -- spdk/autotest.sh@186 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:18.437 23:52:18 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:10:18.437 23:52:18 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:18.437 23:52:18 -- common/autotest_common.sh@10 -- # set +x 00:10:18.697 ************************************ 00:10:18.697 START TEST bdev_raid 00:10:18.697 ************************************ 00:10:18.697 23:52:19 bdev_raid -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:18.697 * Looking for test storage... 
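raid_function_test_raid0, which starts below, assembles a two-member RAID0 bdev from malloc devices inside a bare bdev_svc app and then exercises it through NBD. The RPC batch it loads (the rpcs.txt written by configure_raid_bdev is not echoed verbatim in the trace) amounts to roughly the following; the member size is inferred from the reported blockcnt of 131072 x 512 bytes, and the 64 KiB strip size is an assumed, illustrative value:

  # two 32 MiB malloc members for the 64 MiB raid0 reported in the trace
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b Base_1 32 512
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b Base_2 32 512
  # assemble them into a raid0 bdev named raid (strip size value assumed for illustration)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n raid -z 64 -r raid0 -b 'Base_1 Base_2'
  # export it to the host block layer so dd/blkdiscard/cmp can run against /dev/nbd0
  scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0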
00:10:18.697 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@12 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:18.697 23:52:19 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@14 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@800 -- # trap 'on_error_exit;' ERR 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@802 -- # base_blocklen=512 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@804 -- # uname -s 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@804 -- # '[' Linux = Linux ']' 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@804 -- # modprobe -n nbd 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@805 -- # has_nbd=true 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@806 -- # modprobe nbd 00:10:18.697 23:52:19 bdev_raid -- bdev/bdev_raid.sh@807 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:18.697 23:52:19 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:10:18.697 23:52:19 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:18.697 23:52:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:18.697 ************************************ 00:10:18.697 START TEST raid_function_test_raid0 00:10:18.697 ************************************ 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1121 -- # raid_function_test raid0 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local raid_level=raid0 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # raid_pid=384081 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 384081' 00:10:18.697 Process raid pid: 384081 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@88 -- # waitforlisten 384081 /var/tmp/spdk-raid.sock 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@827 -- # '[' -z 384081 ']' 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:18.697 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
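Once the raid bdev is exported as /dev/nbd0, the data and unmap verification seen further down writes a 2 MiB random pattern through NBD, then for three offset/length pairs zeroes the reference file and discards the matching range on the device, comparing the two byte-for-byte after every step. Condensed from the trace (offsets and lengths are the ones logged; the loop structure is a sketch of bdev_raid.sh's raid_unmap_data_verify, not a verbatim copy):

  # seed a 2 MiB reference file and copy it onto the exported raid bdev
  dd if=/dev/urandom of=/raidrandtest bs=512 count=4096
  dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidrandtest /dev/nbd0
  # zero the reference and discard the same block range on the device, then re-compare
  for range in '0 128' '1028 2035' '321 456'; do
      set -- $range                                  # $1 = block offset, $2 = block count
      dd if=/dev/zero of=/raidrandtest bs=512 seek=$1 count=$2 conv=notrunc
      blkdiscard -o $(($1 * 512)) -l $(($2 * 512)) /dev/nbd0
      blockdev --flushbufs /dev/nbd0
      cmp -b -n 2097152 /raidrandtest /dev/nbd0      # discarded blocks must read back as zeroes
  done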
00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:18.697 23:52:19 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:18.697 [2024-05-14 23:52:19.272029] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:18.697 [2024-05-14 23:52:19.272093] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:18.955 [2024-05-14 23:52:19.402500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.955 [2024-05-14 23:52:19.508196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:19.214 [2024-05-14 23:52:19.569835] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.214 [2024-05-14 23:52:19.569864] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # return 0 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev raid0 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # local raid_level=raid0 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@70 -- # cat 00:10:19.782 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:20.041 [2024-05-14 23:52:20.374600] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:20.041 [2024-05-14 23:52:20.375982] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:20.041 [2024-05-14 23:52:20.376039] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xadc670 00:10:20.041 [2024-05-14 23:52:20.376050] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:20.041 [2024-05-14 23:52:20.376231] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacc570 00:10:20.041 [2024-05-14 23:52:20.376344] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xadc670 00:10:20.041 [2024-05-14 23:52:20.376354] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xadc670 00:10:20.041 [2024-05-14 23:52:20.376468] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.041 Base_1 00:10:20.042 Base_2 00:10:20.042 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:20.042 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:20.042 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:10:20.300 23:52:20 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:20.300 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:20.300 [2024-05-14 23:52:20.867921] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x928660 00:10:20.300 /dev/nbd0 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@865 -- # local i 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # break 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:20.560 1+0 records in 00:10:20.560 1+0 records out 00:10:20.560 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023944 s, 17.1 MB/s 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # size=4096 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # return 0 00:10:20.560 23:52:20 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:20.560 23:52:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:20.819 { 00:10:20.819 "nbd_device": "/dev/nbd0", 00:10:20.819 "bdev_name": "raid" 00:10:20.819 } 00:10:20.819 ]' 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:20.819 { 00:10:20.819 "nbd_device": "/dev/nbd0", 00:10:20.819 "bdev_name": "raid" 00:10:20.819 } 00:10:20.819 ]' 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # count=1 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local blksize 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # blksize=512 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # unmap_blk_offs=('0' '1028' '321') 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:10:20.819 
23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@30 -- # dd if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:10:20.819 4096+0 records in 00:10:20.819 4096+0 records out 00:10:20.819 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0308222 s, 68.0 MB/s 00:10:20.819 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:21.080 4096+0 records in 00:10:21.080 4096+0 records out 00:10:21.080 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.226566 s, 9.3 MB/s 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:21.080 128+0 records in 00:10:21.080 128+0 records out 00:10:21.080 65536 bytes (66 kB, 64 KiB) copied, 0.00035999 s, 182 MB/s 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:21.080 2035+0 records in 00:10:21.080 2035+0 records out 00:10:21.080 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0106517 s, 97.8 MB/s 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:10:21.080 23:52:21 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:21.080 456+0 records in 00:10:21.080 456+0 records out 00:10:21.080 233472 bytes (233 kB, 228 KiB) copied, 0.00268199 s, 87.1 MB/s 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@53 -- # return 0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:21.080 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:21.340 [2024-05-14 23:52:21.865832] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.340 23:52:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.598 23:52:22 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # count=0 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@111 -- # killprocess 384081 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@946 -- # '[' -z 384081 ']' 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # kill -0 384081 00:10:21.598 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # uname 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 384081 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 384081' 00:10:21.857 killing process with pid 384081 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@965 -- # kill 384081 00:10:21.857 [2024-05-14 23:52:22.229932] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:21.857 [2024-05-14 23:52:22.230009] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:21.857 [2024-05-14 23:52:22.230057] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:21.857 [2024-05-14 23:52:22.230069] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xadc670 name raid, state offline 00:10:21.857 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@970 -- # wait 384081 00:10:21.857 [2024-05-14 23:52:22.249646] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:22.116 23:52:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@113 -- # return 0 00:10:22.116 00:10:22.116 real 0m3.288s 00:10:22.116 user 0m4.350s 00:10:22.116 sys 0m1.197s 00:10:22.116 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:22.116 23:52:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:22.116 ************************************ 00:10:22.116 END TEST raid_function_test_raid0 00:10:22.116 ************************************ 00:10:22.116 23:52:22 bdev_raid -- bdev/bdev_raid.sh@808 -- # run_test raid_function_test_concat raid_function_test concat 00:10:22.116 23:52:22 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:10:22.116 23:52:22 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:22.116 23:52:22 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:22.116 ************************************ 00:10:22.116 START TEST raid_function_test_concat 00:10:22.116 ************************************ 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1121 -- # raid_function_test concat 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local raid_level=concat 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # raid_pid=384685 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 384685' 00:10:22.116 Process raid pid: 384685 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@88 -- # waitforlisten 384685 /var/tmp/spdk-raid.sock 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@827 -- # '[' -z 384685 ']' 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:22.116 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:22.116 23:52:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:22.116 [2024-05-14 23:52:22.660550] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
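Every raid test in this log wraps its body in the same app lifecycle: start a private bdev_svc on the shared raid socket, wait for the RPC socket, run the test, then kill the app. Stripped of the run_test bookkeeping, that pattern is approximately the sketch below; waitforlisten and killprocess are helpers from test/common/autotest_common.sh, and the sketch assumes that file is sourced:

bdev_svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
# -r: RPC socket path, -i 0: shared-memory id, -L bdev_raid: enable the bdev_raid debug log flag.
"$bdev_svc" -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock

# ... test body: rpc.py calls against /var/tmp/spdk-raid.sock ...

killprocess "$raid_pid"    # kill + wait, as traced at the end of each test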
00:10:22.116 [2024-05-14 23:52:22.660613] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:22.374 [2024-05-14 23:52:22.790102] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:22.374 [2024-05-14 23:52:22.888256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.374 [2024-05-14 23:52:22.946684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:22.374 [2024-05-14 23:52:22.946720] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # return 0 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev concat 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # local raid_level=concat 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@70 -- # cat 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:23.310 [2024-05-14 23:52:23.852289] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:23.310 [2024-05-14 23:52:23.853691] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:23.310 [2024-05-14 23:52:23.853756] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b65670 00:10:23.310 [2024-05-14 23:52:23.853769] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:23.310 [2024-05-14 23:52:23.853952] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b55570 00:10:23.310 [2024-05-14 23:52:23.854070] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b65670 00:10:23.310 [2024-05-14 23:52:23.854081] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1b65670 00:10:23.310 [2024-05-14 23:52:23.854182] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:23.310 Base_1 00:10:23.310 Base_2 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:23.310 23:52:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:23.569 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:23.828 [2024-05-14 23:52:24.345608] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b1660 00:10:23.828 /dev/nbd0 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@865 -- # local i 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # break 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.828 1+0 records in 00:10:23.828 1+0 records out 00:10:23.828 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025514 s, 16.1 MB/s 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # size=4096 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # return 0 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:23.828 
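The nbd_get_count helper invoked at the end of this block (and again after every nbd_stop_disk in this log) reduces to one RPC call plus a jq/grep count. A minimal sketch, reusing the rpc.py path and socket from the trace; the fallback to 0 mirrors the '# true' branch visible where the disk list is empty:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Ask the SPDK app which nbd devices it currently exports and count them.
nbd_disks_json=$("$rpc" -s "$sock" nbd_get_disks)
count=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
echo "$count"    # 1 while the raid bdev is exported, 0 after nbd_stop_disk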
23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:23.828 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:24.086 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:24.086 { 00:10:24.086 "nbd_device": "/dev/nbd0", 00:10:24.086 "bdev_name": "raid" 00:10:24.086 } 00:10:24.086 ]' 00:10:24.086 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:24.086 { 00:10:24.086 "nbd_device": "/dev/nbd0", 00:10:24.086 "bdev_name": "raid" 00:10:24.086 } 00:10:24.086 ]' 00:10:24.086 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # count=1 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local blksize 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # blksize=512 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # unmap_blk_offs=('0' '1028' '321') 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@30 -- # dd 
if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:10:24.345 4096+0 records in 00:10:24.345 4096+0 records out 00:10:24.345 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0276198 s, 75.9 MB/s 00:10:24.345 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:24.604 4096+0 records in 00:10:24.604 4096+0 records out 00:10:24.604 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.200676 s, 10.5 MB/s 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:24.604 128+0 records in 00:10:24.604 128+0 records out 00:10:24.604 65536 bytes (66 kB, 64 KiB) copied, 0.000883967 s, 74.1 MB/s 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:10:24.604 23:52:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:24.604 2035+0 records in 00:10:24.604 2035+0 records out 00:10:24.604 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0113306 s, 92.0 MB/s 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:24.604 456+0 records in 00:10:24.604 456+0 records out 00:10:24.604 233472 bytes (233 kB, 228 KiB) copied, 
0.00271672 s, 85.9 MB/s 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@53 -- # return 0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:24.604 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:24.863 [2024-05-14 23:52:25.320950] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.863 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:25.146 23:52:25 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # count=0 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@111 -- # killprocess 384685 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@946 -- # '[' -z 384685 ']' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # kill -0 384685 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # uname 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 384685 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 384685' 00:10:25.146 killing process with pid 384685 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@965 -- # kill 384685 00:10:25.146 [2024-05-14 23:52:25.684405] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:25.146 [2024-05-14 23:52:25.684485] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:25.146 [2024-05-14 23:52:25.684531] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:25.146 [2024-05-14 23:52:25.684547] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b65670 name raid, state offline 00:10:25.146 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@970 -- # wait 384685 00:10:25.146 [2024-05-14 23:52:25.703476] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:25.412 23:52:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@113 -- # return 0 00:10:25.412 00:10:25.412 real 0m3.343s 00:10:25.412 user 0m4.491s 00:10:25.412 sys 0m1.202s 00:10:25.412 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:25.412 23:52:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:25.412 ************************************ 00:10:25.412 END TEST raid_function_test_concat 00:10:25.412 ************************************ 00:10:25.412 23:52:25 bdev_raid -- bdev/bdev_raid.sh@811 -- # run_test raid0_resize_test raid0_resize_test 00:10:25.412 23:52:25 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:10:25.412 23:52:25 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:25.412 23:52:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:25.671 ************************************ 00:10:25.671 START TEST raid0_resize_test 00:10:25.671 ************************************ 
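The resize test that starts here boils down to the RPC sequence below: two 32 MiB null bdevs are striped into a raid0, then each is resized to 64 MiB and the raid's block count is checked after each step. A sketch reusing the rpc.py path and socket from the trace; the expected counts are the ones the log verifies further down:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

$rpc bdev_null_create Base_1 32 512                           # 32 MiB, 512-byte blocks
$rpc bdev_null_create Base_2 32 512
$rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid   # raid0, 64 KiB strip

# Growing only one base bdev does not grow the raid ...
$rpc bdev_null_resize Base_1 64
$rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'             # still 131072 (64 MiB)

# ... but once both base bdevs are 64 MiB the raid0 doubles.
$rpc bdev_null_resize Base_2 64
$rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'             # 262144 (128 MiB)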
00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1121 -- # raid0_resize_test 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # raid_pid=385135 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # echo 'Process raid pid: 385135' 00:10:25.671 Process raid pid: 385135 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # waitforlisten 385135 /var/tmp/spdk-raid.sock 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@827 -- # '[' -z 385135 ']' 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:25.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:25.671 23:52:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.671 [2024-05-14 23:52:26.073766] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:10:25.671 [2024-05-14 23:52:26.073809] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:25.671 [2024-05-14 23:52:26.185906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:25.928 [2024-05-14 23:52:26.290312] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:25.928 [2024-05-14 23:52:26.358509] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:25.928 [2024-05-14 23:52:26.358543] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.492 23:52:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:26.492 23:52:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # return 0 00:10:26.492 23:52:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:26.749 Base_1 00:10:26.749 23:52:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:27.007 Base_2 00:10:27.007 23:52:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@363 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:27.265 [2024-05-14 23:52:27.731060] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:27.265 [2024-05-14 23:52:27.732587] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:27.265 [2024-05-14 23:52:27.732636] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1fdb0 00:10:27.265 [2024-05-14 23:52:27.732646] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:27.265 [2024-05-14 23:52:27.732854] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd3840 00:10:27.265 [2024-05-14 23:52:27.732956] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1fdb0 00:10:27.265 [2024-05-14 23:52:27.732965] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xe1fdb0 00:10:27.265 [2024-05-14 23:52:27.733073] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:27.265 23:52:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@366 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:27.524 [2024-05-14 23:52:27.983685] bdev_raid.c:2216:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:27.524 [2024-05-14 23:52:27.983708] bdev_raid.c:2229:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:27.524 true 00:10:27.524 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:27.524 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # jq '.[].num_blocks' 00:10:27.781 [2024-05-14 23:52:28.224458] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:27.781 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # 
blkcnt=131072 00:10:27.781 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # raid_size_mb=64 00:10:27.781 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # '[' 64 '!=' 64 ']' 00:10:27.781 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:28.039 [2024-05-14 23:52:28.464932] bdev_raid.c:2216:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:28.039 [2024-05-14 23:52:28.464952] bdev_raid.c:2229:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:28.039 [2024-05-14 23:52:28.464977] bdev_raid.c:2243:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:28.039 true 00:10:28.039 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:28.039 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # jq '.[].num_blocks' 00:10:28.298 [2024-05-14 23:52:28.705705] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # blkcnt=262144 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # raid_size_mb=128 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@382 -- # '[' 128 '!=' 128 ']' 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # killprocess 385135 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@946 -- # '[' -z 385135 ']' 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # kill -0 385135 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # uname 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 385135 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 385135' 00:10:28.298 killing process with pid 385135 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@965 -- # kill 385135 00:10:28.298 [2024-05-14 23:52:28.777665] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:28.298 [2024-05-14 23:52:28.777734] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:28.298 [2024-05-14 23:52:28.777778] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:28.298 [2024-05-14 23:52:28.777789] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1fdb0 name Raid, state offline 00:10:28.298 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@970 -- # wait 385135 00:10:28.298 [2024-05-14 23:52:28.779103] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:28.556 23:52:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@389 -- # return 0 00:10:28.556 00:10:28.556 real 0m2.955s 
00:10:28.556 user 0m4.551s 00:10:28.556 sys 0m0.637s 00:10:28.556 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:28.556 23:52:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.556 ************************************ 00:10:28.556 END TEST raid0_resize_test 00:10:28.556 ************************************ 00:10:28.556 23:52:29 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:10:28.556 23:52:29 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:10:28.556 23:52:29 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:28.556 23:52:29 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:28.556 23:52:29 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:28.556 23:52:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:28.556 ************************************ 00:10:28.556 START TEST raid_state_function_test 00:10:28.556 ************************************ 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 false 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:28.556 23:52:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=385613 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 385613' 00:10:28.556 Process raid pid: 385613 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 385613 /var/tmp/spdk-raid.sock 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 385613 ']' 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:28.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:28.556 23:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.814 [2024-05-14 23:52:29.149940] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:10:28.814 [2024-05-14 23:52:29.150001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:28.814 [2024-05-14 23:52:29.277372] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:28.814 [2024-05-14 23:52:29.375268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.072 [2024-05-14 23:52:29.435307] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.072 [2024-05-14 23:52:29.435345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.639 23:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:29.639 23:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:10:29.639 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:29.896 [2024-05-14 23:52:30.314039] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:29.896 [2024-05-14 23:52:30.314088] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:29.896 [2024-05-14 23:52:30.314100] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:29.896 [2024-05-14 23:52:30.314113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.897 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.155 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:30.155 "name": "Existed_Raid", 00:10:30.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.155 "strip_size_kb": 64, 00:10:30.155 "state": "configuring", 00:10:30.155 "raid_level": "raid0", 00:10:30.155 "superblock": false, 00:10:30.155 "num_base_bdevs": 2, 
00:10:30.155 "num_base_bdevs_discovered": 0, 00:10:30.155 "num_base_bdevs_operational": 2, 00:10:30.155 "base_bdevs_list": [ 00:10:30.155 { 00:10:30.155 "name": "BaseBdev1", 00:10:30.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.155 "is_configured": false, 00:10:30.155 "data_offset": 0, 00:10:30.155 "data_size": 0 00:10:30.155 }, 00:10:30.155 { 00:10:30.155 "name": "BaseBdev2", 00:10:30.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.155 "is_configured": false, 00:10:30.155 "data_offset": 0, 00:10:30.155 "data_size": 0 00:10:30.155 } 00:10:30.155 ] 00:10:30.155 }' 00:10:30.155 23:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:30.155 23:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:30.721 23:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:30.979 [2024-05-14 23:52:31.328586] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:30.979 [2024-05-14 23:52:31.328621] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffcbc0 name Existed_Raid, state configuring 00:10:30.979 23:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.979 [2024-05-14 23:52:31.501058] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.979 [2024-05-14 23:52:31.501095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.979 [2024-05-14 23:52:31.501106] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.979 [2024-05-14 23:52:31.501118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.979 23:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.237 [2024-05-14 23:52:31.689201] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.237 BaseBdev1 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:31.237 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:31.496 23:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:31.496 [ 00:10:31.496 { 
00:10:31.496 "name": "BaseBdev1", 00:10:31.496 "aliases": [ 00:10:31.496 "75cb82d0-9bc0-4d35-944f-545aaa4d1a07" 00:10:31.496 ], 00:10:31.496 "product_name": "Malloc disk", 00:10:31.496 "block_size": 512, 00:10:31.496 "num_blocks": 65536, 00:10:31.496 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:31.496 "assigned_rate_limits": { 00:10:31.496 "rw_ios_per_sec": 0, 00:10:31.496 "rw_mbytes_per_sec": 0, 00:10:31.496 "r_mbytes_per_sec": 0, 00:10:31.496 "w_mbytes_per_sec": 0 00:10:31.496 }, 00:10:31.496 "claimed": true, 00:10:31.496 "claim_type": "exclusive_write", 00:10:31.496 "zoned": false, 00:10:31.496 "supported_io_types": { 00:10:31.496 "read": true, 00:10:31.496 "write": true, 00:10:31.496 "unmap": true, 00:10:31.496 "write_zeroes": true, 00:10:31.496 "flush": true, 00:10:31.496 "reset": true, 00:10:31.496 "compare": false, 00:10:31.496 "compare_and_write": false, 00:10:31.496 "abort": true, 00:10:31.496 "nvme_admin": false, 00:10:31.496 "nvme_io": false 00:10:31.496 }, 00:10:31.496 "memory_domains": [ 00:10:31.496 { 00:10:31.496 "dma_device_id": "system", 00:10:31.496 "dma_device_type": 1 00:10:31.496 }, 00:10:31.496 { 00:10:31.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.496 "dma_device_type": 2 00:10:31.496 } 00:10:31.496 ], 00:10:31.496 "driver_specific": {} 00:10:31.496 } 00:10:31.496 ] 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.496 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:31.754 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:31.754 "name": "Existed_Raid", 00:10:31.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.754 "strip_size_kb": 64, 00:10:31.754 "state": "configuring", 00:10:31.754 "raid_level": "raid0", 00:10:31.754 "superblock": false, 00:10:31.754 "num_base_bdevs": 2, 00:10:31.754 "num_base_bdevs_discovered": 1, 00:10:31.754 "num_base_bdevs_operational": 2, 00:10:31.754 "base_bdevs_list": [ 00:10:31.754 { 00:10:31.754 "name": "BaseBdev1", 00:10:31.754 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:31.754 
"is_configured": true, 00:10:31.754 "data_offset": 0, 00:10:31.754 "data_size": 65536 00:10:31.754 }, 00:10:31.754 { 00:10:31.754 "name": "BaseBdev2", 00:10:31.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.754 "is_configured": false, 00:10:31.754 "data_offset": 0, 00:10:31.754 "data_size": 0 00:10:31.754 } 00:10:31.754 ] 00:10:31.754 }' 00:10:31.754 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:31.754 23:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.320 23:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:32.577 [2024-05-14 23:52:33.129001] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:32.577 [2024-05-14 23:52:33.129041] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffce60 name Existed_Raid, state configuring 00:10:32.577 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:32.835 [2024-05-14 23:52:33.381693] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:32.835 [2024-05-14 23:52:33.383200] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:32.835 [2024-05-14 23:52:33.383234] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.835 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:33.090 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:33.090 "name": "Existed_Raid", 00:10:33.090 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:33.090 "strip_size_kb": 64, 00:10:33.090 "state": "configuring", 00:10:33.090 "raid_level": "raid0", 00:10:33.090 "superblock": false, 00:10:33.090 "num_base_bdevs": 2, 00:10:33.090 "num_base_bdevs_discovered": 1, 00:10:33.090 "num_base_bdevs_operational": 2, 00:10:33.090 "base_bdevs_list": [ 00:10:33.090 { 00:10:33.090 "name": "BaseBdev1", 00:10:33.090 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:33.090 "is_configured": true, 00:10:33.090 "data_offset": 0, 00:10:33.090 "data_size": 65536 00:10:33.090 }, 00:10:33.090 { 00:10:33.090 "name": "BaseBdev2", 00:10:33.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.090 "is_configured": false, 00:10:33.090 "data_offset": 0, 00:10:33.090 "data_size": 0 00:10:33.090 } 00:10:33.090 ] 00:10:33.090 }' 00:10:33.090 23:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:33.090 23:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:34.020 [2024-05-14 23:52:34.493757] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:34.020 [2024-05-14 23:52:34.493795] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ffc4b0 00:10:34.020 [2024-05-14 23:52:34.493805] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:34.020 [2024-05-14 23:52:34.494002] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ffca70 00:10:34.020 [2024-05-14 23:52:34.494147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ffc4b0 00:10:34.020 [2024-05-14 23:52:34.494158] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ffc4b0 00:10:34.020 [2024-05-14 23:52:34.494340] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.020 BaseBdev2 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:34.020 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:34.278 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:34.536 [ 00:10:34.536 { 00:10:34.536 "name": "BaseBdev2", 00:10:34.536 "aliases": [ 00:10:34.536 "e40a6ab3-241f-449f-8cde-064368f4bba3" 00:10:34.536 ], 00:10:34.536 "product_name": "Malloc disk", 00:10:34.536 "block_size": 512, 00:10:34.536 "num_blocks": 65536, 00:10:34.536 "uuid": 
"e40a6ab3-241f-449f-8cde-064368f4bba3", 00:10:34.536 "assigned_rate_limits": { 00:10:34.536 "rw_ios_per_sec": 0, 00:10:34.537 "rw_mbytes_per_sec": 0, 00:10:34.537 "r_mbytes_per_sec": 0, 00:10:34.537 "w_mbytes_per_sec": 0 00:10:34.537 }, 00:10:34.537 "claimed": true, 00:10:34.537 "claim_type": "exclusive_write", 00:10:34.537 "zoned": false, 00:10:34.537 "supported_io_types": { 00:10:34.537 "read": true, 00:10:34.537 "write": true, 00:10:34.537 "unmap": true, 00:10:34.537 "write_zeroes": true, 00:10:34.537 "flush": true, 00:10:34.537 "reset": true, 00:10:34.537 "compare": false, 00:10:34.537 "compare_and_write": false, 00:10:34.537 "abort": true, 00:10:34.537 "nvme_admin": false, 00:10:34.537 "nvme_io": false 00:10:34.537 }, 00:10:34.537 "memory_domains": [ 00:10:34.537 { 00:10:34.537 "dma_device_id": "system", 00:10:34.537 "dma_device_type": 1 00:10:34.537 }, 00:10:34.537 { 00:10:34.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.537 "dma_device_type": 2 00:10:34.537 } 00:10:34.537 ], 00:10:34.537 "driver_specific": {} 00:10:34.537 } 00:10:34.537 ] 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.537 23:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.795 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:34.795 "name": "Existed_Raid", 00:10:34.795 "uuid": "dc0efc9c-3e03-436c-a3b2-da2d718b9758", 00:10:34.795 "strip_size_kb": 64, 00:10:34.795 "state": "online", 00:10:34.795 "raid_level": "raid0", 00:10:34.795 "superblock": false, 00:10:34.795 "num_base_bdevs": 2, 00:10:34.795 "num_base_bdevs_discovered": 2, 00:10:34.795 "num_base_bdevs_operational": 2, 00:10:34.795 "base_bdevs_list": [ 00:10:34.795 { 00:10:34.795 "name": "BaseBdev1", 00:10:34.795 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:34.795 "is_configured": true, 00:10:34.795 "data_offset": 0, 00:10:34.795 
"data_size": 65536 00:10:34.795 }, 00:10:34.795 { 00:10:34.795 "name": "BaseBdev2", 00:10:34.795 "uuid": "e40a6ab3-241f-449f-8cde-064368f4bba3", 00:10:34.795 "is_configured": true, 00:10:34.795 "data_offset": 0, 00:10:34.795 "data_size": 65536 00:10:34.795 } 00:10:34.795 ] 00:10:34.795 }' 00:10:34.795 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:34.795 23:52:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:35.360 23:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:35.618 [2024-05-14 23:52:35.985958] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:35.618 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:35.618 "name": "Existed_Raid", 00:10:35.618 "aliases": [ 00:10:35.618 "dc0efc9c-3e03-436c-a3b2-da2d718b9758" 00:10:35.618 ], 00:10:35.618 "product_name": "Raid Volume", 00:10:35.618 "block_size": 512, 00:10:35.618 "num_blocks": 131072, 00:10:35.618 "uuid": "dc0efc9c-3e03-436c-a3b2-da2d718b9758", 00:10:35.618 "assigned_rate_limits": { 00:10:35.618 "rw_ios_per_sec": 0, 00:10:35.618 "rw_mbytes_per_sec": 0, 00:10:35.618 "r_mbytes_per_sec": 0, 00:10:35.618 "w_mbytes_per_sec": 0 00:10:35.618 }, 00:10:35.618 "claimed": false, 00:10:35.618 "zoned": false, 00:10:35.618 "supported_io_types": { 00:10:35.618 "read": true, 00:10:35.618 "write": true, 00:10:35.618 "unmap": true, 00:10:35.618 "write_zeroes": true, 00:10:35.618 "flush": true, 00:10:35.618 "reset": true, 00:10:35.618 "compare": false, 00:10:35.618 "compare_and_write": false, 00:10:35.618 "abort": false, 00:10:35.618 "nvme_admin": false, 00:10:35.618 "nvme_io": false 00:10:35.618 }, 00:10:35.618 "memory_domains": [ 00:10:35.618 { 00:10:35.618 "dma_device_id": "system", 00:10:35.618 "dma_device_type": 1 00:10:35.618 }, 00:10:35.618 { 00:10:35.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.618 "dma_device_type": 2 00:10:35.618 }, 00:10:35.618 { 00:10:35.618 "dma_device_id": "system", 00:10:35.618 "dma_device_type": 1 00:10:35.618 }, 00:10:35.618 { 00:10:35.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.618 "dma_device_type": 2 00:10:35.618 } 00:10:35.618 ], 00:10:35.618 "driver_specific": { 00:10:35.618 "raid": { 00:10:35.618 "uuid": "dc0efc9c-3e03-436c-a3b2-da2d718b9758", 00:10:35.618 "strip_size_kb": 64, 00:10:35.619 "state": "online", 00:10:35.619 "raid_level": "raid0", 00:10:35.619 "superblock": false, 00:10:35.619 "num_base_bdevs": 2, 00:10:35.619 "num_base_bdevs_discovered": 2, 00:10:35.619 "num_base_bdevs_operational": 2, 00:10:35.619 
"base_bdevs_list": [ 00:10:35.619 { 00:10:35.619 "name": "BaseBdev1", 00:10:35.619 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:35.619 "is_configured": true, 00:10:35.619 "data_offset": 0, 00:10:35.619 "data_size": 65536 00:10:35.619 }, 00:10:35.619 { 00:10:35.619 "name": "BaseBdev2", 00:10:35.619 "uuid": "e40a6ab3-241f-449f-8cde-064368f4bba3", 00:10:35.619 "is_configured": true, 00:10:35.619 "data_offset": 0, 00:10:35.619 "data_size": 65536 00:10:35.619 } 00:10:35.619 ] 00:10:35.619 } 00:10:35.619 } 00:10:35.619 }' 00:10:35.619 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:35.619 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:35.619 BaseBdev2' 00:10:35.619 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:35.619 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:35.619 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:35.877 "name": "BaseBdev1", 00:10:35.877 "aliases": [ 00:10:35.877 "75cb82d0-9bc0-4d35-944f-545aaa4d1a07" 00:10:35.877 ], 00:10:35.877 "product_name": "Malloc disk", 00:10:35.877 "block_size": 512, 00:10:35.877 "num_blocks": 65536, 00:10:35.877 "uuid": "75cb82d0-9bc0-4d35-944f-545aaa4d1a07", 00:10:35.877 "assigned_rate_limits": { 00:10:35.877 "rw_ios_per_sec": 0, 00:10:35.877 "rw_mbytes_per_sec": 0, 00:10:35.877 "r_mbytes_per_sec": 0, 00:10:35.877 "w_mbytes_per_sec": 0 00:10:35.877 }, 00:10:35.877 "claimed": true, 00:10:35.877 "claim_type": "exclusive_write", 00:10:35.877 "zoned": false, 00:10:35.877 "supported_io_types": { 00:10:35.877 "read": true, 00:10:35.877 "write": true, 00:10:35.877 "unmap": true, 00:10:35.877 "write_zeroes": true, 00:10:35.877 "flush": true, 00:10:35.877 "reset": true, 00:10:35.877 "compare": false, 00:10:35.877 "compare_and_write": false, 00:10:35.877 "abort": true, 00:10:35.877 "nvme_admin": false, 00:10:35.877 "nvme_io": false 00:10:35.877 }, 00:10:35.877 "memory_domains": [ 00:10:35.877 { 00:10:35.877 "dma_device_id": "system", 00:10:35.877 "dma_device_type": 1 00:10:35.877 }, 00:10:35.877 { 00:10:35.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.877 "dma_device_type": 2 00:10:35.877 } 00:10:35.877 ], 00:10:35.877 "driver_specific": {} 00:10:35.877 }' 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:35.877 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.135 23:52:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:36.135 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:36.394 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:36.394 "name": "BaseBdev2", 00:10:36.394 "aliases": [ 00:10:36.394 "e40a6ab3-241f-449f-8cde-064368f4bba3" 00:10:36.394 ], 00:10:36.394 "product_name": "Malloc disk", 00:10:36.394 "block_size": 512, 00:10:36.394 "num_blocks": 65536, 00:10:36.394 "uuid": "e40a6ab3-241f-449f-8cde-064368f4bba3", 00:10:36.394 "assigned_rate_limits": { 00:10:36.394 "rw_ios_per_sec": 0, 00:10:36.394 "rw_mbytes_per_sec": 0, 00:10:36.394 "r_mbytes_per_sec": 0, 00:10:36.394 "w_mbytes_per_sec": 0 00:10:36.394 }, 00:10:36.394 "claimed": true, 00:10:36.394 "claim_type": "exclusive_write", 00:10:36.394 "zoned": false, 00:10:36.394 "supported_io_types": { 00:10:36.394 "read": true, 00:10:36.394 "write": true, 00:10:36.394 "unmap": true, 00:10:36.394 "write_zeroes": true, 00:10:36.394 "flush": true, 00:10:36.394 "reset": true, 00:10:36.394 "compare": false, 00:10:36.394 "compare_and_write": false, 00:10:36.394 "abort": true, 00:10:36.394 "nvme_admin": false, 00:10:36.394 "nvme_io": false 00:10:36.394 }, 00:10:36.394 "memory_domains": [ 00:10:36.394 { 00:10:36.394 "dma_device_id": "system", 00:10:36.394 "dma_device_type": 1 00:10:36.394 }, 00:10:36.394 { 00:10:36.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.394 "dma_device_type": 2 00:10:36.394 } 00:10:36.394 ], 00:10:36.394 "driver_specific": {} 00:10:36.394 }' 00:10:36.394 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.394 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.652 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:36.652 23:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.652 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:36.910 [2024-05-14 23:52:37.393495] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:36.910 [2024-05-14 23:52:37.393522] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.910 [2024-05-14 23:52:37.393563] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.910 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:37.168 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:37.168 "name": "Existed_Raid", 00:10:37.168 "uuid": "dc0efc9c-3e03-436c-a3b2-da2d718b9758", 00:10:37.168 "strip_size_kb": 64, 00:10:37.168 "state": "offline", 00:10:37.168 "raid_level": "raid0", 00:10:37.168 "superblock": false, 00:10:37.168 "num_base_bdevs": 2, 00:10:37.168 "num_base_bdevs_discovered": 1, 00:10:37.168 "num_base_bdevs_operational": 1, 00:10:37.168 "base_bdevs_list": [ 00:10:37.168 { 00:10:37.168 "name": null, 00:10:37.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.168 "is_configured": false, 00:10:37.168 "data_offset": 0, 00:10:37.168 "data_size": 65536 00:10:37.168 }, 00:10:37.168 { 00:10:37.168 "name": "BaseBdev2", 00:10:37.168 "uuid": "e40a6ab3-241f-449f-8cde-064368f4bba3", 00:10:37.168 "is_configured": true, 00:10:37.168 "data_offset": 0, 00:10:37.168 "data_size": 65536 00:10:37.168 } 00:10:37.168 ] 00:10:37.168 }' 00:10:37.168 23:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:10:37.168 23:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.733 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:37.733 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:37.733 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.733 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:37.991 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:37.991 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:37.991 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:38.250 [2024-05-14 23:52:38.586170] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:38.250 [2024-05-14 23:52:38.586223] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffc4b0 name Existed_Raid, state offline 00:10:38.250 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:38.250 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:38.250 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.250 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 385613 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 385613 ']' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 385613 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 385613 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 385613' 00:10:38.509 killing process with pid 385613 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 385613 00:10:38.509 [2024-05-14 23:52:38.922943] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:38.509 23:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # 
wait 385613 00:10:38.509 [2024-05-14 23:52:38.924728] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:38.769 23:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:10:38.769 00:10:38.769 real 0m10.269s 00:10:38.769 user 0m18.100s 00:10:38.769 sys 0m1.795s 00:10:38.769 23:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:38.769 23:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.769 ************************************ 00:10:38.769 END TEST raid_state_function_test 00:10:38.769 ************************************ 00:10:39.029 23:52:39 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:39.029 23:52:39 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:39.029 23:52:39 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:39.029 23:52:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:39.029 ************************************ 00:10:39.029 START TEST raid_state_function_test_sb 00:10:39.029 ************************************ 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 true 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # 
strip_size=64 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=387157 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 387157' 00:10:39.029 Process raid pid: 387157 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 387157 /var/tmp/spdk-raid.sock 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 387157 ']' 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:39.029 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:39.029 23:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:39.029 [2024-05-14 23:52:39.508464] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:10:39.029 [2024-05-14 23:52:39.508515] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:39.412 [2024-05-14 23:52:39.619106] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:39.412 [2024-05-14 23:52:39.726117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:39.412 [2024-05-14 23:52:39.783277] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:39.412 [2024-05-14 23:52:39.783305] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:39.976 23:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:39.976 23:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:10:39.976 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:40.234 [2024-05-14 23:52:40.604376] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:40.234 [2024-05-14 23:52:40.604426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:40.234 [2024-05-14 23:52:40.604437] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:40.234 [2024-05-14 23:52:40.604449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.234 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:40.493 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:40.493 "name": "Existed_Raid", 00:10:40.493 "uuid": "9e0f789a-284f-4145-b13a-cb3e3e613a34", 00:10:40.493 "strip_size_kb": 64, 00:10:40.493 "state": "configuring", 00:10:40.493 "raid_level": "raid0", 00:10:40.493 
"superblock": true, 00:10:40.493 "num_base_bdevs": 2, 00:10:40.493 "num_base_bdevs_discovered": 0, 00:10:40.493 "num_base_bdevs_operational": 2, 00:10:40.493 "base_bdevs_list": [ 00:10:40.493 { 00:10:40.493 "name": "BaseBdev1", 00:10:40.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.493 "is_configured": false, 00:10:40.493 "data_offset": 0, 00:10:40.493 "data_size": 0 00:10:40.493 }, 00:10:40.493 { 00:10:40.493 "name": "BaseBdev2", 00:10:40.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.493 "is_configured": false, 00:10:40.493 "data_offset": 0, 00:10:40.493 "data_size": 0 00:10:40.493 } 00:10:40.493 ] 00:10:40.493 }' 00:10:40.493 23:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:40.493 23:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:41.059 23:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:41.059 [2024-05-14 23:52:41.618909] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:41.059 [2024-05-14 23:52:41.618944] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f40bc0 name Existed_Raid, state configuring 00:10:41.059 23:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:41.317 [2024-05-14 23:52:41.791388] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:41.317 [2024-05-14 23:52:41.791427] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:41.317 [2024-05-14 23:52:41.791438] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:41.317 [2024-05-14 23:52:41.791450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:41.317 23:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:41.575 [2024-05-14 23:52:41.977674] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:41.575 BaseBdev1 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:41.575 23:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:41.834 [ 00:10:41.834 { 00:10:41.834 "name": "BaseBdev1", 00:10:41.834 "aliases": [ 00:10:41.834 "a575b8ff-a409-4bb3-b61a-215624fb0f8b" 00:10:41.834 ], 00:10:41.834 "product_name": "Malloc disk", 00:10:41.834 "block_size": 512, 00:10:41.834 "num_blocks": 65536, 00:10:41.834 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:41.834 "assigned_rate_limits": { 00:10:41.834 "rw_ios_per_sec": 0, 00:10:41.834 "rw_mbytes_per_sec": 0, 00:10:41.834 "r_mbytes_per_sec": 0, 00:10:41.834 "w_mbytes_per_sec": 0 00:10:41.834 }, 00:10:41.834 "claimed": true, 00:10:41.834 "claim_type": "exclusive_write", 00:10:41.834 "zoned": false, 00:10:41.834 "supported_io_types": { 00:10:41.834 "read": true, 00:10:41.834 "write": true, 00:10:41.834 "unmap": true, 00:10:41.834 "write_zeroes": true, 00:10:41.834 "flush": true, 00:10:41.834 "reset": true, 00:10:41.834 "compare": false, 00:10:41.834 "compare_and_write": false, 00:10:41.834 "abort": true, 00:10:41.834 "nvme_admin": false, 00:10:41.834 "nvme_io": false 00:10:41.834 }, 00:10:41.834 "memory_domains": [ 00:10:41.834 { 00:10:41.834 "dma_device_id": "system", 00:10:41.834 "dma_device_type": 1 00:10:41.834 }, 00:10:41.834 { 00:10:41.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.834 "dma_device_type": 2 00:10:41.834 } 00:10:41.834 ], 00:10:41.834 "driver_specific": {} 00:10:41.834 } 00:10:41.834 ] 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.834 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:42.095 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:42.095 "name": "Existed_Raid", 00:10:42.095 "uuid": "ccfdcf03-add1-4613-b80c-2471814a1497", 00:10:42.095 "strip_size_kb": 64, 00:10:42.095 "state": "configuring", 00:10:42.095 "raid_level": "raid0", 00:10:42.095 "superblock": true, 00:10:42.095 "num_base_bdevs": 2, 00:10:42.095 "num_base_bdevs_discovered": 1, 00:10:42.095 "num_base_bdevs_operational": 2, 00:10:42.095 
"base_bdevs_list": [ 00:10:42.095 { 00:10:42.095 "name": "BaseBdev1", 00:10:42.095 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:42.095 "is_configured": true, 00:10:42.095 "data_offset": 2048, 00:10:42.095 "data_size": 63488 00:10:42.095 }, 00:10:42.095 { 00:10:42.095 "name": "BaseBdev2", 00:10:42.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:42.095 "is_configured": false, 00:10:42.095 "data_offset": 0, 00:10:42.095 "data_size": 0 00:10:42.095 } 00:10:42.095 ] 00:10:42.095 }' 00:10:42.095 23:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:42.095 23:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:42.661 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:42.919 [2024-05-14 23:52:43.269090] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:42.919 [2024-05-14 23:52:43.269132] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f40e60 name Existed_Raid, state configuring 00:10:42.919 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:43.177 [2024-05-14 23:52:43.513784] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:43.177 [2024-05-14 23:52:43.515263] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:43.177 [2024-05-14 23:52:43.515294] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.177 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:10:43.436 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:43.436 "name": "Existed_Raid", 00:10:43.436 "uuid": "2154880e-ebaa-4ea9-aae7-e7853e7dae3c", 00:10:43.436 "strip_size_kb": 64, 00:10:43.436 "state": "configuring", 00:10:43.436 "raid_level": "raid0", 00:10:43.436 "superblock": true, 00:10:43.436 "num_base_bdevs": 2, 00:10:43.436 "num_base_bdevs_discovered": 1, 00:10:43.436 "num_base_bdevs_operational": 2, 00:10:43.436 "base_bdevs_list": [ 00:10:43.436 { 00:10:43.436 "name": "BaseBdev1", 00:10:43.436 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:43.436 "is_configured": true, 00:10:43.436 "data_offset": 2048, 00:10:43.436 "data_size": 63488 00:10:43.436 }, 00:10:43.436 { 00:10:43.436 "name": "BaseBdev2", 00:10:43.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.436 "is_configured": false, 00:10:43.436 "data_offset": 0, 00:10:43.436 "data_size": 0 00:10:43.436 } 00:10:43.436 ] 00:10:43.436 }' 00:10:43.436 23:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:43.436 23:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:44.004 [2024-05-14 23:52:44.555856] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:44.004 [2024-05-14 23:52:44.556006] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f404b0 00:10:44.004 [2024-05-14 23:52:44.556020] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:44.004 [2024-05-14 23:52:44.556194] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f40a70 00:10:44.004 [2024-05-14 23:52:44.556305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f404b0 00:10:44.004 [2024-05-14 23:52:44.556315] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f404b0 00:10:44.004 [2024-05-14 23:52:44.556420] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:44.004 BaseBdev2 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:44.004 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:44.005 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:44.005 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:44.263 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:44.522 [ 00:10:44.522 { 00:10:44.522 "name": "BaseBdev2", 00:10:44.522 "aliases": [ 
00:10:44.522 "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42" 00:10:44.522 ], 00:10:44.522 "product_name": "Malloc disk", 00:10:44.522 "block_size": 512, 00:10:44.522 "num_blocks": 65536, 00:10:44.522 "uuid": "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42", 00:10:44.522 "assigned_rate_limits": { 00:10:44.522 "rw_ios_per_sec": 0, 00:10:44.522 "rw_mbytes_per_sec": 0, 00:10:44.522 "r_mbytes_per_sec": 0, 00:10:44.523 "w_mbytes_per_sec": 0 00:10:44.523 }, 00:10:44.523 "claimed": true, 00:10:44.523 "claim_type": "exclusive_write", 00:10:44.523 "zoned": false, 00:10:44.523 "supported_io_types": { 00:10:44.523 "read": true, 00:10:44.523 "write": true, 00:10:44.523 "unmap": true, 00:10:44.523 "write_zeroes": true, 00:10:44.523 "flush": true, 00:10:44.523 "reset": true, 00:10:44.523 "compare": false, 00:10:44.523 "compare_and_write": false, 00:10:44.523 "abort": true, 00:10:44.523 "nvme_admin": false, 00:10:44.523 "nvme_io": false 00:10:44.523 }, 00:10:44.523 "memory_domains": [ 00:10:44.523 { 00:10:44.523 "dma_device_id": "system", 00:10:44.523 "dma_device_type": 1 00:10:44.523 }, 00:10:44.523 { 00:10:44.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.523 "dma_device_type": 2 00:10:44.523 } 00:10:44.523 ], 00:10:44.523 "driver_specific": {} 00:10:44.523 } 00:10:44.523 ] 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.523 23:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:44.782 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:44.782 "name": "Existed_Raid", 00:10:44.782 "uuid": "2154880e-ebaa-4ea9-aae7-e7853e7dae3c", 00:10:44.782 "strip_size_kb": 64, 00:10:44.782 "state": "online", 00:10:44.782 "raid_level": "raid0", 00:10:44.782 "superblock": true, 00:10:44.782 "num_base_bdevs": 2, 00:10:44.782 "num_base_bdevs_discovered": 2, 00:10:44.782 
"num_base_bdevs_operational": 2, 00:10:44.782 "base_bdevs_list": [ 00:10:44.782 { 00:10:44.782 "name": "BaseBdev1", 00:10:44.782 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:44.782 "is_configured": true, 00:10:44.782 "data_offset": 2048, 00:10:44.782 "data_size": 63488 00:10:44.782 }, 00:10:44.782 { 00:10:44.782 "name": "BaseBdev2", 00:10:44.782 "uuid": "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42", 00:10:44.782 "is_configured": true, 00:10:44.782 "data_offset": 2048, 00:10:44.782 "data_size": 63488 00:10:44.782 } 00:10:44.782 ] 00:10:44.782 }' 00:10:44.782 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:44.782 23:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:45.349 [2024-05-14 23:52:45.875619] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:45.349 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:45.349 "name": "Existed_Raid", 00:10:45.349 "aliases": [ 00:10:45.349 "2154880e-ebaa-4ea9-aae7-e7853e7dae3c" 00:10:45.349 ], 00:10:45.349 "product_name": "Raid Volume", 00:10:45.350 "block_size": 512, 00:10:45.350 "num_blocks": 126976, 00:10:45.350 "uuid": "2154880e-ebaa-4ea9-aae7-e7853e7dae3c", 00:10:45.350 "assigned_rate_limits": { 00:10:45.350 "rw_ios_per_sec": 0, 00:10:45.350 "rw_mbytes_per_sec": 0, 00:10:45.350 "r_mbytes_per_sec": 0, 00:10:45.350 "w_mbytes_per_sec": 0 00:10:45.350 }, 00:10:45.350 "claimed": false, 00:10:45.350 "zoned": false, 00:10:45.350 "supported_io_types": { 00:10:45.350 "read": true, 00:10:45.350 "write": true, 00:10:45.350 "unmap": true, 00:10:45.350 "write_zeroes": true, 00:10:45.350 "flush": true, 00:10:45.350 "reset": true, 00:10:45.350 "compare": false, 00:10:45.350 "compare_and_write": false, 00:10:45.350 "abort": false, 00:10:45.350 "nvme_admin": false, 00:10:45.350 "nvme_io": false 00:10:45.350 }, 00:10:45.350 "memory_domains": [ 00:10:45.350 { 00:10:45.350 "dma_device_id": "system", 00:10:45.350 "dma_device_type": 1 00:10:45.350 }, 00:10:45.350 { 00:10:45.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.350 "dma_device_type": 2 00:10:45.350 }, 00:10:45.350 { 00:10:45.350 "dma_device_id": "system", 00:10:45.350 "dma_device_type": 1 00:10:45.350 }, 00:10:45.350 { 00:10:45.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.350 "dma_device_type": 2 00:10:45.350 } 00:10:45.350 ], 00:10:45.350 "driver_specific": { 00:10:45.350 "raid": { 00:10:45.350 "uuid": 
"2154880e-ebaa-4ea9-aae7-e7853e7dae3c", 00:10:45.350 "strip_size_kb": 64, 00:10:45.350 "state": "online", 00:10:45.350 "raid_level": "raid0", 00:10:45.350 "superblock": true, 00:10:45.350 "num_base_bdevs": 2, 00:10:45.350 "num_base_bdevs_discovered": 2, 00:10:45.350 "num_base_bdevs_operational": 2, 00:10:45.350 "base_bdevs_list": [ 00:10:45.350 { 00:10:45.350 "name": "BaseBdev1", 00:10:45.350 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:45.350 "is_configured": true, 00:10:45.350 "data_offset": 2048, 00:10:45.350 "data_size": 63488 00:10:45.350 }, 00:10:45.350 { 00:10:45.350 "name": "BaseBdev2", 00:10:45.350 "uuid": "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42", 00:10:45.350 "is_configured": true, 00:10:45.350 "data_offset": 2048, 00:10:45.350 "data_size": 63488 00:10:45.350 } 00:10:45.350 ] 00:10:45.350 } 00:10:45.350 } 00:10:45.350 }' 00:10:45.350 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:45.609 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:45.609 BaseBdev2' 00:10:45.609 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:45.609 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:45.609 23:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:45.609 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:45.609 "name": "BaseBdev1", 00:10:45.609 "aliases": [ 00:10:45.609 "a575b8ff-a409-4bb3-b61a-215624fb0f8b" 00:10:45.609 ], 00:10:45.609 "product_name": "Malloc disk", 00:10:45.609 "block_size": 512, 00:10:45.609 "num_blocks": 65536, 00:10:45.609 "uuid": "a575b8ff-a409-4bb3-b61a-215624fb0f8b", 00:10:45.609 "assigned_rate_limits": { 00:10:45.609 "rw_ios_per_sec": 0, 00:10:45.609 "rw_mbytes_per_sec": 0, 00:10:45.609 "r_mbytes_per_sec": 0, 00:10:45.609 "w_mbytes_per_sec": 0 00:10:45.609 }, 00:10:45.609 "claimed": true, 00:10:45.609 "claim_type": "exclusive_write", 00:10:45.609 "zoned": false, 00:10:45.609 "supported_io_types": { 00:10:45.609 "read": true, 00:10:45.609 "write": true, 00:10:45.609 "unmap": true, 00:10:45.609 "write_zeroes": true, 00:10:45.609 "flush": true, 00:10:45.609 "reset": true, 00:10:45.609 "compare": false, 00:10:45.609 "compare_and_write": false, 00:10:45.609 "abort": true, 00:10:45.609 "nvme_admin": false, 00:10:45.609 "nvme_io": false 00:10:45.609 }, 00:10:45.609 "memory_domains": [ 00:10:45.609 { 00:10:45.609 "dma_device_id": "system", 00:10:45.609 "dma_device_type": 1 00:10:45.609 }, 00:10:45.609 { 00:10:45.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.609 "dma_device_type": 2 00:10:45.609 } 00:10:45.609 ], 00:10:45.609 "driver_specific": {} 00:10:45.609 }' 00:10:45.609 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:45.868 
23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:45.868 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:46.127 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:46.127 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:46.127 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:46.127 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:46.385 "name": "BaseBdev2", 00:10:46.385 "aliases": [ 00:10:46.385 "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42" 00:10:46.385 ], 00:10:46.385 "product_name": "Malloc disk", 00:10:46.385 "block_size": 512, 00:10:46.385 "num_blocks": 65536, 00:10:46.385 "uuid": "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42", 00:10:46.385 "assigned_rate_limits": { 00:10:46.385 "rw_ios_per_sec": 0, 00:10:46.385 "rw_mbytes_per_sec": 0, 00:10:46.385 "r_mbytes_per_sec": 0, 00:10:46.385 "w_mbytes_per_sec": 0 00:10:46.385 }, 00:10:46.385 "claimed": true, 00:10:46.385 "claim_type": "exclusive_write", 00:10:46.385 "zoned": false, 00:10:46.385 "supported_io_types": { 00:10:46.385 "read": true, 00:10:46.385 "write": true, 00:10:46.385 "unmap": true, 00:10:46.385 "write_zeroes": true, 00:10:46.385 "flush": true, 00:10:46.385 "reset": true, 00:10:46.385 "compare": false, 00:10:46.385 "compare_and_write": false, 00:10:46.385 "abort": true, 00:10:46.385 "nvme_admin": false, 00:10:46.385 "nvme_io": false 00:10:46.385 }, 00:10:46.385 "memory_domains": [ 00:10:46.385 { 00:10:46.385 "dma_device_id": "system", 00:10:46.385 "dma_device_type": 1 00:10:46.385 }, 00:10:46.385 { 00:10:46.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.385 "dma_device_type": 2 00:10:46.385 } 00:10:46.385 ], 00:10:46.385 "driver_specific": {} 00:10:46.385 }' 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:46.385 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:46.385 
23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:46.644 23:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:46.644 [2024-05-14 23:52:47.166849] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:46.644 [2024-05-14 23:52:47.166880] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:46.644 [2024-05-14 23:52:47.166925] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.644 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:46.903 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:46.903 "name": "Existed_Raid", 00:10:46.903 "uuid": "2154880e-ebaa-4ea9-aae7-e7853e7dae3c", 00:10:46.903 "strip_size_kb": 64, 00:10:46.903 "state": "offline", 00:10:46.903 "raid_level": "raid0", 00:10:46.903 "superblock": true, 00:10:46.903 "num_base_bdevs": 2, 00:10:46.903 "num_base_bdevs_discovered": 1, 00:10:46.903 "num_base_bdevs_operational": 1, 00:10:46.903 "base_bdevs_list": [ 00:10:46.903 { 00:10:46.903 "name": null, 00:10:46.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.903 "is_configured": 
false, 00:10:46.903 "data_offset": 2048, 00:10:46.903 "data_size": 63488 00:10:46.903 }, 00:10:46.903 { 00:10:46.903 "name": "BaseBdev2", 00:10:46.903 "uuid": "4bfd47cd-8243-4d63-bce9-1ca4e53a6f42", 00:10:46.903 "is_configured": true, 00:10:46.903 "data_offset": 2048, 00:10:46.903 "data_size": 63488 00:10:46.903 } 00:10:46.903 ] 00:10:46.903 }' 00:10:46.903 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:46.903 23:52:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:47.471 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:47.471 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:47.471 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.471 23:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:47.729 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:47.729 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:47.729 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:47.729 [2024-05-14 23:52:48.311821] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:47.729 [2024-05-14 23:52:48.311876] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f404b0 name Existed_Raid, state offline 00:10:47.988 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:47.988 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:47.988 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.988 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 387157 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 387157 ']' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 387157 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 387157 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 
-- # '[' reactor_0 = sudo ']' 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 387157' 00:10:48.247 killing process with pid 387157 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 387157 00:10:48.247 [2024-05-14 23:52:48.638306] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:48.247 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 387157 00:10:48.247 [2024-05-14 23:52:48.639323] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:48.506 23:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:10:48.506 00:10:48.506 real 0m9.446s 00:10:48.506 user 0m16.692s 00:10:48.506 sys 0m1.788s 00:10:48.506 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:48.506 23:52:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:48.506 ************************************ 00:10:48.506 END TEST raid_state_function_test_sb 00:10:48.506 ************************************ 00:10:48.506 23:52:48 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:48.506 23:52:48 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:10:48.506 23:52:48 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:48.506 23:52:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:48.506 ************************************ 00:10:48.506 START TEST raid_superblock_test 00:10:48.506 ************************************ 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 2 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:10:48.506 23:52:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=388617 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 388617 /var/tmp/spdk-raid.sock 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 388617 ']' 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:48.506 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:48.506 23:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.506 [2024-05-14 23:52:49.037323] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:10:48.506 [2024-05-14 23:52:49.037390] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid388617 ] 00:10:48.766 [2024-05-14 23:52:49.165953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:48.766 [2024-05-14 23:52:49.267748] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:48.766 [2024-05-14 23:52:49.333793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:48.766 [2024-05-14 23:52:49.333829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:49.334 23:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:49.593 malloc1 00:10:49.593 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:49.853 [2024-05-14 23:52:50.392860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:49.853 [2024-05-14 23:52:50.392906] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:49.853 [2024-05-14 23:52:50.392929] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20cf780 00:10:49.853 [2024-05-14 23:52:50.392942] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:49.853 [2024-05-14 23:52:50.394720] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:49.853 [2024-05-14 23:52:50.394750] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:49.853 pt1 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:49.853 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:50.112 malloc2 00:10:50.112 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:50.371 [2024-05-14 23:52:50.888280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:50.371 [2024-05-14 23:52:50.888327] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:50.371 [2024-05-14 23:52:50.888347] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d0b60 00:10:50.371 [2024-05-14 23:52:50.888360] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:50.371 [2024-05-14 23:52:50.889960] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:50.371 [2024-05-14 23:52:50.889989] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:50.371 pt2 00:10:50.371 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:50.371 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:50.371 23:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:50.630 [2024-05-14 23:52:51.128936] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:10:50.630 [2024-05-14 23:52:51.130257] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:50.630 [2024-05-14 23:52:51.130412] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x227c1f0 00:10:50.630 [2024-05-14 23:52:51.130426] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:50.630 [2024-05-14 23:52:51.130621] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e6670 00:10:50.630 [2024-05-14 23:52:51.130765] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x227c1f0 00:10:50.630 [2024-05-14 23:52:51.130775] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x227c1f0 00:10:50.630 [2024-05-14 23:52:51.130878] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.630 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:50.889 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:50.889 "name": "raid_bdev1", 00:10:50.889 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:50.889 "strip_size_kb": 64, 00:10:50.889 "state": "online", 00:10:50.889 "raid_level": "raid0", 00:10:50.889 "superblock": true, 00:10:50.889 "num_base_bdevs": 2, 00:10:50.889 "num_base_bdevs_discovered": 2, 00:10:50.889 "num_base_bdevs_operational": 2, 00:10:50.889 "base_bdevs_list": [ 00:10:50.889 { 00:10:50.889 "name": "pt1", 00:10:50.889 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:50.889 "is_configured": true, 00:10:50.889 "data_offset": 2048, 00:10:50.889 "data_size": 63488 00:10:50.889 }, 00:10:50.889 { 00:10:50.889 "name": "pt2", 00:10:50.889 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:50.889 "is_configured": true, 00:10:50.889 "data_offset": 2048, 00:10:50.889 "data_size": 63488 00:10:50.889 } 00:10:50.889 ] 00:10:50.889 }' 00:10:50.889 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:50.889 23:52:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:51.457 23:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:51.715 [2024-05-14 23:52:52.103718] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:51.715 "name": "raid_bdev1", 00:10:51.715 "aliases": [ 00:10:51.715 "597bb95a-6dc2-47a0-8c0a-928cf5bc3368" 00:10:51.715 ], 00:10:51.715 "product_name": "Raid Volume", 00:10:51.715 "block_size": 512, 00:10:51.715 "num_blocks": 126976, 00:10:51.715 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:51.715 "assigned_rate_limits": { 00:10:51.715 "rw_ios_per_sec": 0, 00:10:51.715 "rw_mbytes_per_sec": 0, 00:10:51.715 "r_mbytes_per_sec": 0, 00:10:51.715 "w_mbytes_per_sec": 0 00:10:51.715 }, 00:10:51.715 "claimed": false, 00:10:51.715 "zoned": false, 00:10:51.715 "supported_io_types": { 00:10:51.715 "read": true, 00:10:51.715 "write": true, 00:10:51.715 "unmap": true, 00:10:51.715 "write_zeroes": true, 00:10:51.715 "flush": true, 00:10:51.715 "reset": true, 00:10:51.715 "compare": false, 00:10:51.715 "compare_and_write": false, 00:10:51.715 "abort": false, 00:10:51.715 "nvme_admin": false, 00:10:51.715 "nvme_io": false 00:10:51.715 }, 00:10:51.715 "memory_domains": [ 00:10:51.715 { 00:10:51.715 "dma_device_id": "system", 00:10:51.715 "dma_device_type": 1 00:10:51.715 }, 00:10:51.715 { 00:10:51.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.715 "dma_device_type": 2 00:10:51.715 }, 00:10:51.715 { 00:10:51.715 "dma_device_id": "system", 00:10:51.715 "dma_device_type": 1 00:10:51.715 }, 00:10:51.715 { 00:10:51.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.715 "dma_device_type": 2 00:10:51.715 } 00:10:51.715 ], 00:10:51.715 "driver_specific": { 00:10:51.715 "raid": { 00:10:51.715 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:51.715 "strip_size_kb": 64, 00:10:51.715 "state": "online", 00:10:51.715 "raid_level": "raid0", 00:10:51.715 "superblock": true, 00:10:51.715 "num_base_bdevs": 2, 00:10:51.715 "num_base_bdevs_discovered": 2, 00:10:51.715 "num_base_bdevs_operational": 2, 00:10:51.715 "base_bdevs_list": [ 00:10:51.715 { 00:10:51.715 "name": "pt1", 00:10:51.715 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:51.715 "is_configured": true, 00:10:51.715 "data_offset": 2048, 00:10:51.715 "data_size": 63488 00:10:51.715 }, 00:10:51.715 { 00:10:51.715 "name": "pt2", 00:10:51.715 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:51.715 "is_configured": true, 00:10:51.715 "data_offset": 2048, 00:10:51.715 "data_size": 63488 00:10:51.715 } 00:10:51.715 ] 00:10:51.715 } 00:10:51.715 } 00:10:51.715 }' 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:51.715 pt2' 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:51.715 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:51.973 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:51.973 "name": "pt1", 00:10:51.974 "aliases": [ 00:10:51.974 "685bebbe-aaa9-5cb8-b5ae-087826d7c0af" 00:10:51.974 ], 00:10:51.974 "product_name": "passthru", 00:10:51.974 "block_size": 512, 00:10:51.974 "num_blocks": 65536, 00:10:51.974 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:51.974 "assigned_rate_limits": { 00:10:51.974 "rw_ios_per_sec": 0, 00:10:51.974 "rw_mbytes_per_sec": 0, 00:10:51.974 "r_mbytes_per_sec": 0, 00:10:51.974 "w_mbytes_per_sec": 0 00:10:51.974 }, 00:10:51.974 "claimed": true, 00:10:51.974 "claim_type": "exclusive_write", 00:10:51.974 "zoned": false, 00:10:51.974 "supported_io_types": { 00:10:51.974 "read": true, 00:10:51.974 "write": true, 00:10:51.974 "unmap": true, 00:10:51.974 "write_zeroes": true, 00:10:51.974 "flush": true, 00:10:51.974 "reset": true, 00:10:51.974 "compare": false, 00:10:51.974 "compare_and_write": false, 00:10:51.974 "abort": true, 00:10:51.974 "nvme_admin": false, 00:10:51.974 "nvme_io": false 00:10:51.974 }, 00:10:51.974 "memory_domains": [ 00:10:51.974 { 00:10:51.974 "dma_device_id": "system", 00:10:51.974 "dma_device_type": 1 00:10:51.974 }, 00:10:51.974 { 00:10:51.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.974 "dma_device_type": 2 00:10:51.974 } 00:10:51.974 ], 00:10:51.974 "driver_specific": { 00:10:51.974 "passthru": { 00:10:51.974 "name": "pt1", 00:10:51.974 "base_bdev_name": "malloc1" 00:10:51.974 } 00:10:51.974 } 00:10:51.974 }' 00:10:51.974 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:51.974 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:51.974 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:51.974 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:51.974 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:52.232 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:52.490 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:52.490 "name": "pt2", 00:10:52.490 "aliases": [ 00:10:52.491 "d9102b99-5fbd-575f-b357-55d2f9d8736f" 00:10:52.491 ], 00:10:52.491 "product_name": "passthru", 00:10:52.491 "block_size": 512, 00:10:52.491 "num_blocks": 65536, 00:10:52.491 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:52.491 "assigned_rate_limits": { 00:10:52.491 "rw_ios_per_sec": 0, 00:10:52.491 "rw_mbytes_per_sec": 0, 00:10:52.491 "r_mbytes_per_sec": 0, 00:10:52.491 "w_mbytes_per_sec": 0 00:10:52.491 }, 00:10:52.491 "claimed": true, 00:10:52.491 "claim_type": "exclusive_write", 00:10:52.491 "zoned": false, 00:10:52.491 "supported_io_types": { 00:10:52.491 "read": true, 00:10:52.491 "write": true, 00:10:52.491 "unmap": true, 00:10:52.491 "write_zeroes": true, 00:10:52.491 "flush": true, 00:10:52.491 "reset": true, 00:10:52.491 "compare": false, 00:10:52.491 "compare_and_write": false, 00:10:52.491 "abort": true, 00:10:52.491 "nvme_admin": false, 00:10:52.491 "nvme_io": false 00:10:52.491 }, 00:10:52.491 "memory_domains": [ 00:10:52.491 { 00:10:52.491 "dma_device_id": "system", 00:10:52.491 "dma_device_type": 1 00:10:52.491 }, 00:10:52.491 { 00:10:52.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.491 "dma_device_type": 2 00:10:52.491 } 00:10:52.491 ], 00:10:52.491 "driver_specific": { 00:10:52.491 "passthru": { 00:10:52.491 "name": "pt2", 00:10:52.491 "base_bdev_name": "malloc2" 00:10:52.491 } 00:10:52.491 } 00:10:52.491 }' 00:10:52.491 23:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:52.491 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:52.491 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:52.491 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:52.748 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:10:53.005 [2024-05-14 23:52:53.539531] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:53.005 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=597bb95a-6dc2-47a0-8c0a-928cf5bc3368 00:10:53.005 23:52:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # '[' -z 597bb95a-6dc2-47a0-8c0a-928cf5bc3368 ']' 00:10:53.005 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:53.265 [2024-05-14 23:52:53.703722] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:53.265 [2024-05-14 23:52:53.703748] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:53.265 [2024-05-14 23:52:53.703807] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:53.265 [2024-05-14 23:52:53.703852] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:53.265 [2024-05-14 23:52:53.703864] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x227c1f0 name raid_bdev1, state offline 00:10:53.265 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.265 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:10:53.523 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:10:53.523 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:10:53.523 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:53.523 23:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:53.782 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:53.782 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:54.041 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:54.041 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:54.300 
23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:54.300 [2024-05-14 23:52:54.826636] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:54.300 [2024-05-14 23:52:54.828026] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:54.300 [2024-05-14 23:52:54.828084] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:54.300 [2024-05-14 23:52:54.828123] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:54.300 [2024-05-14 23:52:54.828143] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:54.300 [2024-05-14 23:52:54.828153] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20cfc00 name raid_bdev1, state configuring 00:10:54.300 request: 00:10:54.300 { 00:10:54.300 "name": "raid_bdev1", 00:10:54.300 "raid_level": "raid0", 00:10:54.300 "base_bdevs": [ 00:10:54.300 "malloc1", 00:10:54.300 "malloc2" 00:10:54.300 ], 00:10:54.300 "superblock": false, 00:10:54.300 "strip_size_kb": 64, 00:10:54.300 "method": "bdev_raid_create", 00:10:54.300 "req_id": 1 00:10:54.300 } 00:10:54.300 Got JSON-RPC error response 00:10:54.300 response: 00:10:54.300 { 00:10:54.300 "code": -17, 00:10:54.300 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:54.300 } 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.300 23:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:10:54.559 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:10:54.559 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:10:54.559 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:54.818 [2024-05-14 23:52:55.231655] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:54.819 [2024-05-14 23:52:55.231704] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.819 [2024-05-14 23:52:55.231728] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20cf9b0 00:10:54.819 [2024-05-14 23:52:55.231740] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.819 [2024-05-14 23:52:55.233394] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.819 [2024-05-14 23:52:55.233441] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:54.819 [2024-05-14 23:52:55.233511] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:10:54.819 [2024-05-14 23:52:55.233538] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:54.819 pt1 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.819 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:55.078 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:55.078 "name": "raid_bdev1", 00:10:55.078 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:55.078 "strip_size_kb": 64, 00:10:55.078 "state": "configuring", 00:10:55.078 "raid_level": "raid0", 00:10:55.078 "superblock": true, 00:10:55.078 "num_base_bdevs": 2, 00:10:55.078 "num_base_bdevs_discovered": 1, 00:10:55.078 "num_base_bdevs_operational": 2, 00:10:55.078 "base_bdevs_list": [ 00:10:55.078 { 00:10:55.078 "name": "pt1", 00:10:55.078 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:55.078 "is_configured": true, 00:10:55.078 "data_offset": 2048, 00:10:55.078 "data_size": 63488 00:10:55.078 }, 00:10:55.078 { 00:10:55.078 "name": null, 00:10:55.078 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:55.078 "is_configured": false, 00:10:55.078 "data_offset": 2048, 00:10:55.078 "data_size": 63488 00:10:55.078 } 00:10:55.078 ] 00:10:55.078 }' 00:10:55.078 23:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:55.078 23:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
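For reference, the setup traced above reduces to a short RPC sequence: two 32 MiB malloc bdevs are created, wrapped in passthru bdevs pt1/pt2 with fixed UUIDs, and assembled into the RAID0 volume raid_bdev1 with a 64 KiB strip size and an on-disk superblock (-s). The negative test that follows shows bdev_raid_create aimed directly at malloc1/malloc2 failing with -17 ("File exists") even after raid_bdev1 and the passthru bdevs are deleted, because the superblock written through pt1/pt2 persists on the underlying malloc bdevs. A minimal shell sketch of that sequence, assuming an SPDK target is still listening on /var/tmp/spdk-raid.sock and reusing only commands and flags that appear in this trace:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # two 32 MiB backing malloc bdevs with 512-byte blocks (65536 blocks each)
  $rpc -s $sock bdev_malloc_create 32 512 -b malloc1
  $rpc -s $sock bdev_malloc_create 32 512 -b malloc2

  # passthru bdevs layered on top, with the fixed UUIDs the test uses
  $rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # RAID0 over pt1/pt2, 64 KiB strip size, superblock enabled (-s)
  $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s

  # confirm the volume is online with both base bdevs configured
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

When pt1 is re-created afterwards (bdev_raid.sh@465 above), the examine path finds that superblock again and raid_bdev1 reappears in the "configuring" state with one of two base bdevs discovered, which is exactly the state dumped just before this point.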
00:10:55.644 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:10:55.644 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:10:55.644 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:55.644 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:55.902 [2024-05-14 23:52:56.310539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:55.902 [2024-05-14 23:52:56.310586] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:55.902 [2024-05-14 23:52:56.310605] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22807a0 00:10:55.902 [2024-05-14 23:52:56.310617] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:55.902 [2024-05-14 23:52:56.310950] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:55.902 [2024-05-14 23:52:56.310967] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:55.902 [2024-05-14 23:52:56.311028] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:10:55.902 [2024-05-14 23:52:56.311047] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:55.902 [2024-05-14 23:52:56.311146] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x22809f0 00:10:55.902 [2024-05-14 23:52:56.311156] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:55.902 [2024-05-14 23:52:56.311329] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x227a380 00:10:55.902 [2024-05-14 23:52:56.311465] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22809f0 00:10:55.902 [2024-05-14 23:52:56.311476] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22809f0 00:10:55.902 [2024-05-14 23:52:56.311577] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:55.902 pt2 00:10:55.902 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:55.903 23:52:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:55.903 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.161 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:56.161 "name": "raid_bdev1", 00:10:56.161 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:56.161 "strip_size_kb": 64, 00:10:56.161 "state": "online", 00:10:56.161 "raid_level": "raid0", 00:10:56.161 "superblock": true, 00:10:56.161 "num_base_bdevs": 2, 00:10:56.161 "num_base_bdevs_discovered": 2, 00:10:56.161 "num_base_bdevs_operational": 2, 00:10:56.161 "base_bdevs_list": [ 00:10:56.161 { 00:10:56.161 "name": "pt1", 00:10:56.161 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:56.161 "is_configured": true, 00:10:56.161 "data_offset": 2048, 00:10:56.161 "data_size": 63488 00:10:56.161 }, 00:10:56.161 { 00:10:56.161 "name": "pt2", 00:10:56.161 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:56.161 "is_configured": true, 00:10:56.161 "data_offset": 2048, 00:10:56.161 "data_size": 63488 00:10:56.161 } 00:10:56.161 ] 00:10:56.161 }' 00:10:56.161 23:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:56.161 23:52:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:56.727 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:56.986 [2024-05-14 23:52:57.397653] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:56.986 "name": "raid_bdev1", 00:10:56.986 "aliases": [ 00:10:56.986 "597bb95a-6dc2-47a0-8c0a-928cf5bc3368" 00:10:56.986 ], 00:10:56.986 "product_name": "Raid Volume", 00:10:56.986 "block_size": 512, 00:10:56.986 "num_blocks": 126976, 00:10:56.986 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:56.986 "assigned_rate_limits": { 00:10:56.986 "rw_ios_per_sec": 0, 00:10:56.986 "rw_mbytes_per_sec": 0, 00:10:56.986 "r_mbytes_per_sec": 0, 00:10:56.986 "w_mbytes_per_sec": 0 00:10:56.986 }, 00:10:56.986 "claimed": false, 00:10:56.986 "zoned": false, 00:10:56.986 "supported_io_types": { 00:10:56.986 "read": true, 00:10:56.986 "write": true, 00:10:56.986 "unmap": true, 00:10:56.986 "write_zeroes": true, 00:10:56.986 "flush": true, 00:10:56.986 "reset": true, 00:10:56.986 "compare": false, 00:10:56.986 "compare_and_write": 
false, 00:10:56.986 "abort": false, 00:10:56.986 "nvme_admin": false, 00:10:56.986 "nvme_io": false 00:10:56.986 }, 00:10:56.986 "memory_domains": [ 00:10:56.986 { 00:10:56.986 "dma_device_id": "system", 00:10:56.986 "dma_device_type": 1 00:10:56.986 }, 00:10:56.986 { 00:10:56.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.986 "dma_device_type": 2 00:10:56.986 }, 00:10:56.986 { 00:10:56.986 "dma_device_id": "system", 00:10:56.986 "dma_device_type": 1 00:10:56.986 }, 00:10:56.986 { 00:10:56.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.986 "dma_device_type": 2 00:10:56.986 } 00:10:56.986 ], 00:10:56.986 "driver_specific": { 00:10:56.986 "raid": { 00:10:56.986 "uuid": "597bb95a-6dc2-47a0-8c0a-928cf5bc3368", 00:10:56.986 "strip_size_kb": 64, 00:10:56.986 "state": "online", 00:10:56.986 "raid_level": "raid0", 00:10:56.986 "superblock": true, 00:10:56.986 "num_base_bdevs": 2, 00:10:56.986 "num_base_bdevs_discovered": 2, 00:10:56.986 "num_base_bdevs_operational": 2, 00:10:56.986 "base_bdevs_list": [ 00:10:56.986 { 00:10:56.986 "name": "pt1", 00:10:56.986 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:56.986 "is_configured": true, 00:10:56.986 "data_offset": 2048, 00:10:56.986 "data_size": 63488 00:10:56.986 }, 00:10:56.986 { 00:10:56.986 "name": "pt2", 00:10:56.986 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:56.986 "is_configured": true, 00:10:56.986 "data_offset": 2048, 00:10:56.986 "data_size": 63488 00:10:56.986 } 00:10:56.986 ] 00:10:56.986 } 00:10:56.986 } 00:10:56.986 }' 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:56.986 pt2' 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:56.986 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:57.245 "name": "pt1", 00:10:57.245 "aliases": [ 00:10:57.245 "685bebbe-aaa9-5cb8-b5ae-087826d7c0af" 00:10:57.245 ], 00:10:57.245 "product_name": "passthru", 00:10:57.245 "block_size": 512, 00:10:57.245 "num_blocks": 65536, 00:10:57.245 "uuid": "685bebbe-aaa9-5cb8-b5ae-087826d7c0af", 00:10:57.245 "assigned_rate_limits": { 00:10:57.245 "rw_ios_per_sec": 0, 00:10:57.245 "rw_mbytes_per_sec": 0, 00:10:57.245 "r_mbytes_per_sec": 0, 00:10:57.245 "w_mbytes_per_sec": 0 00:10:57.245 }, 00:10:57.245 "claimed": true, 00:10:57.245 "claim_type": "exclusive_write", 00:10:57.245 "zoned": false, 00:10:57.245 "supported_io_types": { 00:10:57.245 "read": true, 00:10:57.245 "write": true, 00:10:57.245 "unmap": true, 00:10:57.245 "write_zeroes": true, 00:10:57.245 "flush": true, 00:10:57.245 "reset": true, 00:10:57.245 "compare": false, 00:10:57.245 "compare_and_write": false, 00:10:57.245 "abort": true, 00:10:57.245 "nvme_admin": false, 00:10:57.245 "nvme_io": false 00:10:57.245 }, 00:10:57.245 "memory_domains": [ 00:10:57.245 { 00:10:57.245 "dma_device_id": "system", 00:10:57.245 "dma_device_type": 1 00:10:57.245 }, 00:10:57.245 { 00:10:57.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.245 "dma_device_type": 2 
00:10:57.245 } 00:10:57.245 ], 00:10:57.245 "driver_specific": { 00:10:57.245 "passthru": { 00:10:57.245 "name": "pt1", 00:10:57.245 "base_bdev_name": "malloc1" 00:10:57.245 } 00:10:57.245 } 00:10:57.245 }' 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:57.245 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:57.504 23:52:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:57.504 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:57.504 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:57.504 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:57.504 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:57.763 "name": "pt2", 00:10:57.763 "aliases": [ 00:10:57.763 "d9102b99-5fbd-575f-b357-55d2f9d8736f" 00:10:57.763 ], 00:10:57.763 "product_name": "passthru", 00:10:57.763 "block_size": 512, 00:10:57.763 "num_blocks": 65536, 00:10:57.763 "uuid": "d9102b99-5fbd-575f-b357-55d2f9d8736f", 00:10:57.763 "assigned_rate_limits": { 00:10:57.763 "rw_ios_per_sec": 0, 00:10:57.763 "rw_mbytes_per_sec": 0, 00:10:57.763 "r_mbytes_per_sec": 0, 00:10:57.763 "w_mbytes_per_sec": 0 00:10:57.763 }, 00:10:57.763 "claimed": true, 00:10:57.763 "claim_type": "exclusive_write", 00:10:57.763 "zoned": false, 00:10:57.763 "supported_io_types": { 00:10:57.763 "read": true, 00:10:57.763 "write": true, 00:10:57.763 "unmap": true, 00:10:57.763 "write_zeroes": true, 00:10:57.763 "flush": true, 00:10:57.763 "reset": true, 00:10:57.763 "compare": false, 00:10:57.763 "compare_and_write": false, 00:10:57.763 "abort": true, 00:10:57.763 "nvme_admin": false, 00:10:57.763 "nvme_io": false 00:10:57.763 }, 00:10:57.763 "memory_domains": [ 00:10:57.763 { 00:10:57.763 "dma_device_id": "system", 00:10:57.763 "dma_device_type": 1 00:10:57.763 }, 00:10:57.763 { 00:10:57.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.763 "dma_device_type": 2 00:10:57.763 } 00:10:57.763 ], 00:10:57.763 "driver_specific": { 00:10:57.763 "passthru": { 00:10:57.763 "name": "pt2", 00:10:57.763 "base_bdev_name": "malloc2" 00:10:57.763 } 00:10:57.763 } 00:10:57.763 }' 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:57.763 23:52:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:57.763 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:58.021 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:58.021 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:58.021 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:58.021 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:58.021 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:58.022 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:58.022 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:10:58.281 [2024-05-14 23:52:58.729176] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 597bb95a-6dc2-47a0-8c0a-928cf5bc3368 '!=' 597bb95a-6dc2-47a0-8c0a-928cf5bc3368 ']' 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 388617 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 388617 ']' 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 388617 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 388617 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 388617' 00:10:58.281 killing process with pid 388617 00:10:58.281 23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 388617 00:10:58.281 [2024-05-14 23:52:58.797828] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:58.281 [2024-05-14 23:52:58.797889] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:58.281 [2024-05-14 23:52:58.797933] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:58.281 [2024-05-14 23:52:58.797945] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22809f0 name raid_bdev1, state offline 00:10:58.281 
23:52:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 388617 00:10:58.281 [2024-05-14 23:52:58.814481] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:58.539 23:52:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:10:58.539 00:10:58.539 real 0m10.068s 00:10:58.539 user 0m17.964s 00:10:58.539 sys 0m1.834s 00:10:58.539 23:52:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:58.539 23:52:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.539 ************************************ 00:10:58.539 END TEST raid_superblock_test 00:10:58.539 ************************************ 00:10:58.539 23:52:59 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:10:58.539 23:52:59 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:58.539 23:52:59 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:58.539 23:52:59 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:58.539 23:52:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.798 ************************************ 00:10:58.798 START TEST raid_state_function_test 00:10:58.798 ************************************ 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 false 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 
00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=390239 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 390239' 00:10:58.798 Process raid pid: 390239 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 390239 /var/tmp/spdk-raid.sock 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 390239 ']' 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:58.798 23:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.798 [2024-05-14 23:52:59.199045] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:10:58.798 [2024-05-14 23:52:59.199106] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.798 [2024-05-14 23:52:59.329387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:59.057 [2024-05-14 23:52:59.436619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.057 [2024-05-14 23:52:59.502559] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.057 [2024-05-14 23:52:59.502594] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.623 23:53:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:59.623 23:53:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:10:59.623 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:59.882 [2024-05-14 23:53:00.358036] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:59.882 [2024-05-14 23:53:00.358077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:59.882 [2024-05-14 23:53:00.358088] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:59.882 [2024-05-14 23:53:00.358100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.882 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:00.140 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:00.140 "name": "Existed_Raid", 00:11:00.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.140 "strip_size_kb": 64, 00:11:00.140 "state": "configuring", 00:11:00.140 "raid_level": "concat", 00:11:00.140 "superblock": false, 00:11:00.140 "num_base_bdevs": 
2, 00:11:00.140 "num_base_bdevs_discovered": 0, 00:11:00.140 "num_base_bdevs_operational": 2, 00:11:00.140 "base_bdevs_list": [ 00:11:00.140 { 00:11:00.140 "name": "BaseBdev1", 00:11:00.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.140 "is_configured": false, 00:11:00.140 "data_offset": 0, 00:11:00.140 "data_size": 0 00:11:00.140 }, 00:11:00.140 { 00:11:00.140 "name": "BaseBdev2", 00:11:00.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.140 "is_configured": false, 00:11:00.140 "data_offset": 0, 00:11:00.140 "data_size": 0 00:11:00.140 } 00:11:00.141 ] 00:11:00.141 }' 00:11:00.141 23:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:00.141 23:53:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.706 23:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:00.963 [2024-05-14 23:53:01.456824] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:00.963 [2024-05-14 23:53:01.456856] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa70bc0 name Existed_Raid, state configuring 00:11:00.963 23:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:01.222 [2024-05-14 23:53:01.693467] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:01.222 [2024-05-14 23:53:01.693501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:01.222 [2024-05-14 23:53:01.693511] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:01.222 [2024-05-14 23:53:01.693523] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:01.222 23:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:01.480 [2024-05-14 23:53:01.928632] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:01.480 BaseBdev1 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:01.480 23:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:01.745 23:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:02.010 [ 00:11:02.010 { 
00:11:02.010 "name": "BaseBdev1", 00:11:02.010 "aliases": [ 00:11:02.010 "cf3b61c6-1723-48a9-b485-432c7e1c57ad" 00:11:02.010 ], 00:11:02.010 "product_name": "Malloc disk", 00:11:02.010 "block_size": 512, 00:11:02.010 "num_blocks": 65536, 00:11:02.010 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:02.010 "assigned_rate_limits": { 00:11:02.010 "rw_ios_per_sec": 0, 00:11:02.010 "rw_mbytes_per_sec": 0, 00:11:02.010 "r_mbytes_per_sec": 0, 00:11:02.010 "w_mbytes_per_sec": 0 00:11:02.010 }, 00:11:02.010 "claimed": true, 00:11:02.010 "claim_type": "exclusive_write", 00:11:02.010 "zoned": false, 00:11:02.010 "supported_io_types": { 00:11:02.010 "read": true, 00:11:02.010 "write": true, 00:11:02.010 "unmap": true, 00:11:02.010 "write_zeroes": true, 00:11:02.010 "flush": true, 00:11:02.010 "reset": true, 00:11:02.010 "compare": false, 00:11:02.010 "compare_and_write": false, 00:11:02.010 "abort": true, 00:11:02.010 "nvme_admin": false, 00:11:02.010 "nvme_io": false 00:11:02.010 }, 00:11:02.010 "memory_domains": [ 00:11:02.010 { 00:11:02.010 "dma_device_id": "system", 00:11:02.010 "dma_device_type": 1 00:11:02.010 }, 00:11:02.010 { 00:11:02.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:02.010 "dma_device_type": 2 00:11:02.010 } 00:11:02.010 ], 00:11:02.010 "driver_specific": {} 00:11:02.010 } 00:11:02.010 ] 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:02.010 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:02.011 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:02.011 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:02.011 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.011 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:02.268 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:02.268 "name": "Existed_Raid", 00:11:02.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.269 "strip_size_kb": 64, 00:11:02.269 "state": "configuring", 00:11:02.269 "raid_level": "concat", 00:11:02.269 "superblock": false, 00:11:02.269 "num_base_bdevs": 2, 00:11:02.269 "num_base_bdevs_discovered": 1, 00:11:02.269 "num_base_bdevs_operational": 2, 00:11:02.269 "base_bdevs_list": [ 00:11:02.269 { 00:11:02.269 "name": "BaseBdev1", 00:11:02.269 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:02.269 
"is_configured": true, 00:11:02.269 "data_offset": 0, 00:11:02.269 "data_size": 65536 00:11:02.269 }, 00:11:02.269 { 00:11:02.269 "name": "BaseBdev2", 00:11:02.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.269 "is_configured": false, 00:11:02.269 "data_offset": 0, 00:11:02.269 "data_size": 0 00:11:02.269 } 00:11:02.269 ] 00:11:02.269 }' 00:11:02.269 23:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:02.269 23:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.835 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:02.835 [2024-05-14 23:53:03.408556] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:02.835 [2024-05-14 23:53:03.408598] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa70e60 name Existed_Raid, state configuring 00:11:02.835 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:03.093 [2024-05-14 23:53:03.645215] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:03.093 [2024-05-14 23:53:03.646705] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:03.093 [2024-05-14 23:53:03.646737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:03.093 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:03.094 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:03.094 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.094 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.352 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:03.352 "name": "Existed_Raid", 00:11:03.352 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:03.352 "strip_size_kb": 64, 00:11:03.352 "state": "configuring", 00:11:03.352 "raid_level": "concat", 00:11:03.352 "superblock": false, 00:11:03.352 "num_base_bdevs": 2, 00:11:03.352 "num_base_bdevs_discovered": 1, 00:11:03.352 "num_base_bdevs_operational": 2, 00:11:03.352 "base_bdevs_list": [ 00:11:03.352 { 00:11:03.352 "name": "BaseBdev1", 00:11:03.352 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:03.352 "is_configured": true, 00:11:03.352 "data_offset": 0, 00:11:03.352 "data_size": 65536 00:11:03.352 }, 00:11:03.352 { 00:11:03.352 "name": "BaseBdev2", 00:11:03.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.352 "is_configured": false, 00:11:03.352 "data_offset": 0, 00:11:03.352 "data_size": 0 00:11:03.352 } 00:11:03.352 ] 00:11:03.352 }' 00:11:03.352 23:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:03.352 23:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.940 23:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:04.198 [2024-05-14 23:53:04.667337] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:04.198 [2024-05-14 23:53:04.667375] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xa704b0 00:11:04.198 [2024-05-14 23:53:04.667384] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:04.198 [2024-05-14 23:53:04.667582] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa70a70 00:11:04.198 [2024-05-14 23:53:04.667702] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa704b0 00:11:04.198 [2024-05-14 23:53:04.667712] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa704b0 00:11:04.198 [2024-05-14 23:53:04.667877] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.198 BaseBdev2 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:04.198 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.456 23:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:04.714 [ 00:11:04.714 { 00:11:04.714 "name": "BaseBdev2", 00:11:04.714 "aliases": [ 00:11:04.714 "35e83b4e-9234-4679-a497-0f2bb45a3103" 00:11:04.714 ], 00:11:04.714 "product_name": "Malloc disk", 00:11:04.714 "block_size": 512, 00:11:04.714 "num_blocks": 65536, 00:11:04.714 "uuid": 
"35e83b4e-9234-4679-a497-0f2bb45a3103", 00:11:04.714 "assigned_rate_limits": { 00:11:04.714 "rw_ios_per_sec": 0, 00:11:04.714 "rw_mbytes_per_sec": 0, 00:11:04.714 "r_mbytes_per_sec": 0, 00:11:04.714 "w_mbytes_per_sec": 0 00:11:04.714 }, 00:11:04.714 "claimed": true, 00:11:04.714 "claim_type": "exclusive_write", 00:11:04.714 "zoned": false, 00:11:04.714 "supported_io_types": { 00:11:04.714 "read": true, 00:11:04.714 "write": true, 00:11:04.714 "unmap": true, 00:11:04.714 "write_zeroes": true, 00:11:04.714 "flush": true, 00:11:04.715 "reset": true, 00:11:04.715 "compare": false, 00:11:04.715 "compare_and_write": false, 00:11:04.715 "abort": true, 00:11:04.715 "nvme_admin": false, 00:11:04.715 "nvme_io": false 00:11:04.715 }, 00:11:04.715 "memory_domains": [ 00:11:04.715 { 00:11:04.715 "dma_device_id": "system", 00:11:04.715 "dma_device_type": 1 00:11:04.715 }, 00:11:04.715 { 00:11:04.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.715 "dma_device_type": 2 00:11:04.715 } 00:11:04.715 ], 00:11:04.715 "driver_specific": {} 00:11:04.715 } 00:11:04.715 ] 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.715 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.973 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:04.973 "name": "Existed_Raid", 00:11:04.973 "uuid": "0455c8c1-3ec0-4d9d-9571-d32605414467", 00:11:04.973 "strip_size_kb": 64, 00:11:04.973 "state": "online", 00:11:04.973 "raid_level": "concat", 00:11:04.973 "superblock": false, 00:11:04.973 "num_base_bdevs": 2, 00:11:04.973 "num_base_bdevs_discovered": 2, 00:11:04.973 "num_base_bdevs_operational": 2, 00:11:04.973 "base_bdevs_list": [ 00:11:04.973 { 00:11:04.973 "name": "BaseBdev1", 00:11:04.973 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:04.973 "is_configured": true, 00:11:04.973 "data_offset": 0, 
00:11:04.973 "data_size": 65536 00:11:04.973 }, 00:11:04.973 { 00:11:04.973 "name": "BaseBdev2", 00:11:04.973 "uuid": "35e83b4e-9234-4679-a497-0f2bb45a3103", 00:11:04.973 "is_configured": true, 00:11:04.973 "data_offset": 0, 00:11:04.973 "data_size": 65536 00:11:04.973 } 00:11:04.973 ] 00:11:04.973 }' 00:11:04.973 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:04.973 23:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:05.539 23:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:05.539 [2024-05-14 23:53:06.103403] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.539 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:05.539 "name": "Existed_Raid", 00:11:05.539 "aliases": [ 00:11:05.539 "0455c8c1-3ec0-4d9d-9571-d32605414467" 00:11:05.539 ], 00:11:05.539 "product_name": "Raid Volume", 00:11:05.539 "block_size": 512, 00:11:05.539 "num_blocks": 131072, 00:11:05.539 "uuid": "0455c8c1-3ec0-4d9d-9571-d32605414467", 00:11:05.539 "assigned_rate_limits": { 00:11:05.539 "rw_ios_per_sec": 0, 00:11:05.539 "rw_mbytes_per_sec": 0, 00:11:05.539 "r_mbytes_per_sec": 0, 00:11:05.539 "w_mbytes_per_sec": 0 00:11:05.539 }, 00:11:05.539 "claimed": false, 00:11:05.539 "zoned": false, 00:11:05.539 "supported_io_types": { 00:11:05.539 "read": true, 00:11:05.539 "write": true, 00:11:05.539 "unmap": true, 00:11:05.539 "write_zeroes": true, 00:11:05.539 "flush": true, 00:11:05.539 "reset": true, 00:11:05.539 "compare": false, 00:11:05.539 "compare_and_write": false, 00:11:05.539 "abort": false, 00:11:05.539 "nvme_admin": false, 00:11:05.539 "nvme_io": false 00:11:05.539 }, 00:11:05.539 "memory_domains": [ 00:11:05.539 { 00:11:05.539 "dma_device_id": "system", 00:11:05.539 "dma_device_type": 1 00:11:05.539 }, 00:11:05.539 { 00:11:05.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.539 "dma_device_type": 2 00:11:05.539 }, 00:11:05.539 { 00:11:05.539 "dma_device_id": "system", 00:11:05.539 "dma_device_type": 1 00:11:05.539 }, 00:11:05.539 { 00:11:05.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.539 "dma_device_type": 2 00:11:05.539 } 00:11:05.539 ], 00:11:05.539 "driver_specific": { 00:11:05.539 "raid": { 00:11:05.539 "uuid": "0455c8c1-3ec0-4d9d-9571-d32605414467", 00:11:05.539 "strip_size_kb": 64, 00:11:05.539 "state": "online", 00:11:05.539 "raid_level": "concat", 00:11:05.539 "superblock": false, 00:11:05.539 "num_base_bdevs": 2, 00:11:05.539 "num_base_bdevs_discovered": 2, 00:11:05.539 "num_base_bdevs_operational": 2, 00:11:05.539 
"base_bdevs_list": [ 00:11:05.539 { 00:11:05.539 "name": "BaseBdev1", 00:11:05.539 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:05.539 "is_configured": true, 00:11:05.539 "data_offset": 0, 00:11:05.539 "data_size": 65536 00:11:05.539 }, 00:11:05.539 { 00:11:05.539 "name": "BaseBdev2", 00:11:05.539 "uuid": "35e83b4e-9234-4679-a497-0f2bb45a3103", 00:11:05.539 "is_configured": true, 00:11:05.539 "data_offset": 0, 00:11:05.539 "data_size": 65536 00:11:05.539 } 00:11:05.539 ] 00:11:05.539 } 00:11:05.539 } 00:11:05.539 }' 00:11:05.797 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.797 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:05.797 BaseBdev2' 00:11:05.797 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:05.797 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:05.797 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:06.055 "name": "BaseBdev1", 00:11:06.055 "aliases": [ 00:11:06.055 "cf3b61c6-1723-48a9-b485-432c7e1c57ad" 00:11:06.055 ], 00:11:06.055 "product_name": "Malloc disk", 00:11:06.055 "block_size": 512, 00:11:06.055 "num_blocks": 65536, 00:11:06.055 "uuid": "cf3b61c6-1723-48a9-b485-432c7e1c57ad", 00:11:06.055 "assigned_rate_limits": { 00:11:06.055 "rw_ios_per_sec": 0, 00:11:06.055 "rw_mbytes_per_sec": 0, 00:11:06.055 "r_mbytes_per_sec": 0, 00:11:06.055 "w_mbytes_per_sec": 0 00:11:06.055 }, 00:11:06.055 "claimed": true, 00:11:06.055 "claim_type": "exclusive_write", 00:11:06.055 "zoned": false, 00:11:06.055 "supported_io_types": { 00:11:06.055 "read": true, 00:11:06.055 "write": true, 00:11:06.055 "unmap": true, 00:11:06.055 "write_zeroes": true, 00:11:06.055 "flush": true, 00:11:06.055 "reset": true, 00:11:06.055 "compare": false, 00:11:06.055 "compare_and_write": false, 00:11:06.055 "abort": true, 00:11:06.055 "nvme_admin": false, 00:11:06.055 "nvme_io": false 00:11:06.055 }, 00:11:06.055 "memory_domains": [ 00:11:06.055 { 00:11:06.055 "dma_device_id": "system", 00:11:06.055 "dma_device_type": 1 00:11:06.055 }, 00:11:06.055 { 00:11:06.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.055 "dma_device_type": 2 00:11:06.055 } 00:11:06.055 ], 00:11:06.055 "driver_specific": {} 00:11:06.055 }' 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.055 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.313 23:53:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:06.313 23:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:06.572 "name": "BaseBdev2", 00:11:06.572 "aliases": [ 00:11:06.572 "35e83b4e-9234-4679-a497-0f2bb45a3103" 00:11:06.572 ], 00:11:06.572 "product_name": "Malloc disk", 00:11:06.572 "block_size": 512, 00:11:06.572 "num_blocks": 65536, 00:11:06.572 "uuid": "35e83b4e-9234-4679-a497-0f2bb45a3103", 00:11:06.572 "assigned_rate_limits": { 00:11:06.572 "rw_ios_per_sec": 0, 00:11:06.572 "rw_mbytes_per_sec": 0, 00:11:06.572 "r_mbytes_per_sec": 0, 00:11:06.572 "w_mbytes_per_sec": 0 00:11:06.572 }, 00:11:06.572 "claimed": true, 00:11:06.572 "claim_type": "exclusive_write", 00:11:06.572 "zoned": false, 00:11:06.572 "supported_io_types": { 00:11:06.572 "read": true, 00:11:06.572 "write": true, 00:11:06.572 "unmap": true, 00:11:06.572 "write_zeroes": true, 00:11:06.572 "flush": true, 00:11:06.572 "reset": true, 00:11:06.572 "compare": false, 00:11:06.572 "compare_and_write": false, 00:11:06.572 "abort": true, 00:11:06.572 "nvme_admin": false, 00:11:06.572 "nvme_io": false 00:11:06.572 }, 00:11:06.572 "memory_domains": [ 00:11:06.572 { 00:11:06.572 "dma_device_id": "system", 00:11:06.572 "dma_device_type": 1 00:11:06.572 }, 00:11:06.572 { 00:11:06.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.572 "dma_device_type": 2 00:11:06.572 } 00:11:06.572 ], 00:11:06.572 "driver_specific": {} 00:11:06.572 }' 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.572 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:06.831 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:07.099 [2024-05-14 23:53:07.567130] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:07.099 [2024-05-14 23:53:07.567162] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.099 [2024-05-14 23:53:07.567204] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.099 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.395 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:07.395 "name": "Existed_Raid", 00:11:07.395 "uuid": "0455c8c1-3ec0-4d9d-9571-d32605414467", 00:11:07.395 "strip_size_kb": 64, 00:11:07.395 "state": "offline", 00:11:07.395 "raid_level": "concat", 00:11:07.395 "superblock": false, 00:11:07.395 "num_base_bdevs": 2, 00:11:07.395 "num_base_bdevs_discovered": 1, 00:11:07.395 "num_base_bdevs_operational": 1, 00:11:07.395 "base_bdevs_list": [ 00:11:07.395 { 00:11:07.395 "name": null, 00:11:07.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.395 "is_configured": false, 00:11:07.395 "data_offset": 0, 00:11:07.395 "data_size": 65536 00:11:07.395 }, 00:11:07.395 { 00:11:07.395 "name": "BaseBdev2", 00:11:07.395 "uuid": "35e83b4e-9234-4679-a497-0f2bb45a3103", 00:11:07.395 "is_configured": true, 00:11:07.396 "data_offset": 0, 00:11:07.396 "data_size": 65536 00:11:07.396 } 00:11:07.396 ] 00:11:07.396 }' 00:11:07.396 23:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:11:07.396 23:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.974 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:07.974 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:07.974 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:07.974 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.233 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:08.233 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:08.233 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:08.491 [2024-05-14 23:53:08.851583] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:08.491 [2024-05-14 23:53:08.851638] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa704b0 name Existed_Raid, state offline 00:11:08.491 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:08.491 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:08.492 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.492 23:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 390239 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 390239 ']' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 390239 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 390239 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 390239' 00:11:08.750 killing process with pid 390239 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 390239 00:11:08.750 [2024-05-14 23:53:09.186820] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:08.750 23:53:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@970 -- # wait 390239 00:11:08.750 [2024-05-14 23:53:09.187731] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:11:09.009 00:11:09.009 real 0m10.295s 00:11:09.009 user 0m18.172s 00:11:09.009 sys 0m1.948s 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.009 ************************************ 00:11:09.009 END TEST raid_state_function_test 00:11:09.009 ************************************ 00:11:09.009 23:53:09 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:09.009 23:53:09 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:09.009 23:53:09 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:09.009 23:53:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:09.009 ************************************ 00:11:09.009 START TEST raid_state_function_test_sb 00:11:09.009 ************************************ 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 true 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:11:09.009 23:53:09 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@232 -- # strip_size=64 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=391797 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 391797' 00:11:09.010 Process raid pid: 391797 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 391797 /var/tmp/spdk-raid.sock 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 391797 ']' 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:09.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:09.010 23:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:09.010 [2024-05-14 23:53:09.594599] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:11:09.010 [2024-05-14 23:53:09.594664] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:09.268 [2024-05-14 23:53:09.724143] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:09.268 [2024-05-14 23:53:09.831992] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:09.526 [2024-05-14 23:53:09.899344] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.526 [2024-05-14 23:53:09.899375] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:10.093 23:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:10.093 23:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:11:10.093 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:10.352 [2024-05-14 23:53:10.730904] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:10.352 [2024-05-14 23:53:10.730946] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:10.352 [2024-05-14 23:53:10.730959] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:10.352 [2024-05-14 23:53:10.730975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.352 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:10.611 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:10.611 "name": "Existed_Raid", 00:11:10.611 "uuid": "c968fe15-4d2e-44ff-bd70-9c496157b432", 00:11:10.611 "strip_size_kb": 64, 00:11:10.611 "state": "configuring", 00:11:10.611 "raid_level": "concat", 00:11:10.611 
"superblock": true, 00:11:10.611 "num_base_bdevs": 2, 00:11:10.611 "num_base_bdevs_discovered": 0, 00:11:10.611 "num_base_bdevs_operational": 2, 00:11:10.611 "base_bdevs_list": [ 00:11:10.611 { 00:11:10.611 "name": "BaseBdev1", 00:11:10.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.611 "is_configured": false, 00:11:10.611 "data_offset": 0, 00:11:10.611 "data_size": 0 00:11:10.611 }, 00:11:10.611 { 00:11:10.611 "name": "BaseBdev2", 00:11:10.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.611 "is_configured": false, 00:11:10.611 "data_offset": 0, 00:11:10.611 "data_size": 0 00:11:10.611 } 00:11:10.611 ] 00:11:10.611 }' 00:11:10.611 23:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:10.611 23:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:11.178 23:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:11.178 [2024-05-14 23:53:11.733422] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:11.178 [2024-05-14 23:53:11.733452] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1920bc0 name Existed_Raid, state configuring 00:11:11.178 23:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:11.436 [2024-05-14 23:53:11.909906] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:11.436 [2024-05-14 23:53:11.909935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:11.436 [2024-05-14 23:53:11.909944] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:11.436 [2024-05-14 23:53:11.909956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:11.436 23:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:11.695 [2024-05-14 23:53:12.096223] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:11.695 BaseBdev1 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:11.695 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:11.953 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:11.953 [ 00:11:11.953 { 00:11:11.953 "name": "BaseBdev1", 00:11:11.953 "aliases": [ 00:11:11.953 "3ef15378-72e6-426d-915b-b8c59b846fdc" 00:11:11.953 ], 00:11:11.953 "product_name": "Malloc disk", 00:11:11.953 "block_size": 512, 00:11:11.953 "num_blocks": 65536, 00:11:11.953 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:11.953 "assigned_rate_limits": { 00:11:11.953 "rw_ios_per_sec": 0, 00:11:11.953 "rw_mbytes_per_sec": 0, 00:11:11.953 "r_mbytes_per_sec": 0, 00:11:11.953 "w_mbytes_per_sec": 0 00:11:11.953 }, 00:11:11.953 "claimed": true, 00:11:11.953 "claim_type": "exclusive_write", 00:11:11.953 "zoned": false, 00:11:11.953 "supported_io_types": { 00:11:11.953 "read": true, 00:11:11.953 "write": true, 00:11:11.953 "unmap": true, 00:11:11.953 "write_zeroes": true, 00:11:11.953 "flush": true, 00:11:11.953 "reset": true, 00:11:11.953 "compare": false, 00:11:11.953 "compare_and_write": false, 00:11:11.953 "abort": true, 00:11:11.953 "nvme_admin": false, 00:11:11.953 "nvme_io": false 00:11:11.953 }, 00:11:11.953 "memory_domains": [ 00:11:11.953 { 00:11:11.953 "dma_device_id": "system", 00:11:11.953 "dma_device_type": 1 00:11:11.953 }, 00:11:11.953 { 00:11:11.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.953 "dma_device_type": 2 00:11:11.953 } 00:11:11.953 ], 00:11:11.953 "driver_specific": {} 00:11:11.953 } 00:11:11.953 ] 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:12.212 "name": "Existed_Raid", 00:11:12.212 "uuid": "3797f3c3-3285-469f-a19e-e5b68c671d9c", 00:11:12.212 "strip_size_kb": 64, 00:11:12.212 "state": "configuring", 00:11:12.212 "raid_level": "concat", 00:11:12.212 "superblock": true, 00:11:12.212 "num_base_bdevs": 2, 00:11:12.212 "num_base_bdevs_discovered": 1, 00:11:12.212 "num_base_bdevs_operational": 2, 00:11:12.212 
"base_bdevs_list": [ 00:11:12.212 { 00:11:12.212 "name": "BaseBdev1", 00:11:12.212 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:12.212 "is_configured": true, 00:11:12.212 "data_offset": 2048, 00:11:12.212 "data_size": 63488 00:11:12.212 }, 00:11:12.212 { 00:11:12.212 "name": "BaseBdev2", 00:11:12.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.212 "is_configured": false, 00:11:12.212 "data_offset": 0, 00:11:12.212 "data_size": 0 00:11:12.212 } 00:11:12.212 ] 00:11:12.212 }' 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:12.212 23:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:12.778 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:13.036 [2024-05-14 23:53:13.500116] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:13.036 [2024-05-14 23:53:13.500153] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1920e60 name Existed_Raid, state configuring 00:11:13.036 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:13.293 [2024-05-14 23:53:13.752828] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:13.294 [2024-05-14 23:53:13.754321] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:13.294 [2024-05-14 23:53:13.754355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.294 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:13.552 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:13.552 "name": "Existed_Raid", 00:11:13.552 "uuid": "2da8e652-cca3-47cd-8e57-f3e3998f889b", 00:11:13.552 "strip_size_kb": 64, 00:11:13.552 "state": "configuring", 00:11:13.552 "raid_level": "concat", 00:11:13.552 "superblock": true, 00:11:13.552 "num_base_bdevs": 2, 00:11:13.552 "num_base_bdevs_discovered": 1, 00:11:13.552 "num_base_bdevs_operational": 2, 00:11:13.552 "base_bdevs_list": [ 00:11:13.552 { 00:11:13.552 "name": "BaseBdev1", 00:11:13.552 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:13.552 "is_configured": true, 00:11:13.552 "data_offset": 2048, 00:11:13.552 "data_size": 63488 00:11:13.552 }, 00:11:13.552 { 00:11:13.552 "name": "BaseBdev2", 00:11:13.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.552 "is_configured": false, 00:11:13.552 "data_offset": 0, 00:11:13.552 "data_size": 0 00:11:13.552 } 00:11:13.552 ] 00:11:13.552 }' 00:11:13.552 23:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:13.552 23:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:14.119 [2024-05-14 23:53:14.686865] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:14.119 [2024-05-14 23:53:14.687011] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19204b0 00:11:14.119 [2024-05-14 23:53:14.687031] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:14.119 [2024-05-14 23:53:14.687202] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1920a70 00:11:14.119 [2024-05-14 23:53:14.687314] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19204b0 00:11:14.119 [2024-05-14 23:53:14.687324] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19204b0 00:11:14.119 [2024-05-14 23:53:14.687431] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:14.119 BaseBdev2 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:14.119 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:14.377 23:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:14.636 [ 00:11:14.636 { 00:11:14.636 "name": "BaseBdev2", 00:11:14.636 
"aliases": [ 00:11:14.636 "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70" 00:11:14.636 ], 00:11:14.636 "product_name": "Malloc disk", 00:11:14.636 "block_size": 512, 00:11:14.636 "num_blocks": 65536, 00:11:14.636 "uuid": "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70", 00:11:14.636 "assigned_rate_limits": { 00:11:14.636 "rw_ios_per_sec": 0, 00:11:14.636 "rw_mbytes_per_sec": 0, 00:11:14.636 "r_mbytes_per_sec": 0, 00:11:14.636 "w_mbytes_per_sec": 0 00:11:14.636 }, 00:11:14.636 "claimed": true, 00:11:14.636 "claim_type": "exclusive_write", 00:11:14.636 "zoned": false, 00:11:14.636 "supported_io_types": { 00:11:14.636 "read": true, 00:11:14.636 "write": true, 00:11:14.636 "unmap": true, 00:11:14.636 "write_zeroes": true, 00:11:14.636 "flush": true, 00:11:14.636 "reset": true, 00:11:14.636 "compare": false, 00:11:14.636 "compare_and_write": false, 00:11:14.636 "abort": true, 00:11:14.636 "nvme_admin": false, 00:11:14.636 "nvme_io": false 00:11:14.636 }, 00:11:14.636 "memory_domains": [ 00:11:14.636 { 00:11:14.636 "dma_device_id": "system", 00:11:14.636 "dma_device_type": 1 00:11:14.636 }, 00:11:14.636 { 00:11:14.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.636 "dma_device_type": 2 00:11:14.636 } 00:11:14.636 ], 00:11:14.636 "driver_specific": {} 00:11:14.636 } 00:11:14.636 ] 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.636 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:14.895 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:14.895 "name": "Existed_Raid", 00:11:14.895 "uuid": "2da8e652-cca3-47cd-8e57-f3e3998f889b", 00:11:14.895 "strip_size_kb": 64, 00:11:14.895 "state": "online", 00:11:14.895 "raid_level": "concat", 00:11:14.895 "superblock": true, 00:11:14.895 "num_base_bdevs": 2, 00:11:14.895 "num_base_bdevs_discovered": 2, 
00:11:14.895 "num_base_bdevs_operational": 2, 00:11:14.895 "base_bdevs_list": [ 00:11:14.895 { 00:11:14.895 "name": "BaseBdev1", 00:11:14.895 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:14.895 "is_configured": true, 00:11:14.895 "data_offset": 2048, 00:11:14.895 "data_size": 63488 00:11:14.895 }, 00:11:14.895 { 00:11:14.895 "name": "BaseBdev2", 00:11:14.895 "uuid": "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70", 00:11:14.895 "is_configured": true, 00:11:14.895 "data_offset": 2048, 00:11:14.895 "data_size": 63488 00:11:14.895 } 00:11:14.895 ] 00:11:14.895 }' 00:11:14.895 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:14.895 23:53:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:15.461 23:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:15.719 [2024-05-14 23:53:16.187094] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:15.719 "name": "Existed_Raid", 00:11:15.719 "aliases": [ 00:11:15.719 "2da8e652-cca3-47cd-8e57-f3e3998f889b" 00:11:15.719 ], 00:11:15.719 "product_name": "Raid Volume", 00:11:15.719 "block_size": 512, 00:11:15.719 "num_blocks": 126976, 00:11:15.719 "uuid": "2da8e652-cca3-47cd-8e57-f3e3998f889b", 00:11:15.719 "assigned_rate_limits": { 00:11:15.719 "rw_ios_per_sec": 0, 00:11:15.719 "rw_mbytes_per_sec": 0, 00:11:15.719 "r_mbytes_per_sec": 0, 00:11:15.719 "w_mbytes_per_sec": 0 00:11:15.719 }, 00:11:15.719 "claimed": false, 00:11:15.719 "zoned": false, 00:11:15.719 "supported_io_types": { 00:11:15.719 "read": true, 00:11:15.719 "write": true, 00:11:15.719 "unmap": true, 00:11:15.719 "write_zeroes": true, 00:11:15.719 "flush": true, 00:11:15.719 "reset": true, 00:11:15.719 "compare": false, 00:11:15.719 "compare_and_write": false, 00:11:15.719 "abort": false, 00:11:15.719 "nvme_admin": false, 00:11:15.719 "nvme_io": false 00:11:15.719 }, 00:11:15.719 "memory_domains": [ 00:11:15.719 { 00:11:15.719 "dma_device_id": "system", 00:11:15.719 "dma_device_type": 1 00:11:15.719 }, 00:11:15.719 { 00:11:15.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.719 "dma_device_type": 2 00:11:15.719 }, 00:11:15.719 { 00:11:15.719 "dma_device_id": "system", 00:11:15.719 "dma_device_type": 1 00:11:15.719 }, 00:11:15.719 { 00:11:15.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.719 "dma_device_type": 2 00:11:15.719 } 00:11:15.719 ], 00:11:15.719 "driver_specific": { 00:11:15.719 "raid": { 00:11:15.719 "uuid": 
"2da8e652-cca3-47cd-8e57-f3e3998f889b", 00:11:15.719 "strip_size_kb": 64, 00:11:15.719 "state": "online", 00:11:15.719 "raid_level": "concat", 00:11:15.719 "superblock": true, 00:11:15.719 "num_base_bdevs": 2, 00:11:15.719 "num_base_bdevs_discovered": 2, 00:11:15.719 "num_base_bdevs_operational": 2, 00:11:15.719 "base_bdevs_list": [ 00:11:15.719 { 00:11:15.719 "name": "BaseBdev1", 00:11:15.719 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:15.719 "is_configured": true, 00:11:15.719 "data_offset": 2048, 00:11:15.719 "data_size": 63488 00:11:15.719 }, 00:11:15.719 { 00:11:15.719 "name": "BaseBdev2", 00:11:15.719 "uuid": "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70", 00:11:15.719 "is_configured": true, 00:11:15.719 "data_offset": 2048, 00:11:15.719 "data_size": 63488 00:11:15.719 } 00:11:15.719 ] 00:11:15.719 } 00:11:15.719 } 00:11:15.719 }' 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:15.719 BaseBdev2' 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:15.719 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:15.978 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:15.978 "name": "BaseBdev1", 00:11:15.978 "aliases": [ 00:11:15.978 "3ef15378-72e6-426d-915b-b8c59b846fdc" 00:11:15.978 ], 00:11:15.978 "product_name": "Malloc disk", 00:11:15.978 "block_size": 512, 00:11:15.978 "num_blocks": 65536, 00:11:15.978 "uuid": "3ef15378-72e6-426d-915b-b8c59b846fdc", 00:11:15.978 "assigned_rate_limits": { 00:11:15.978 "rw_ios_per_sec": 0, 00:11:15.978 "rw_mbytes_per_sec": 0, 00:11:15.978 "r_mbytes_per_sec": 0, 00:11:15.978 "w_mbytes_per_sec": 0 00:11:15.978 }, 00:11:15.978 "claimed": true, 00:11:15.978 "claim_type": "exclusive_write", 00:11:15.978 "zoned": false, 00:11:15.978 "supported_io_types": { 00:11:15.978 "read": true, 00:11:15.978 "write": true, 00:11:15.978 "unmap": true, 00:11:15.978 "write_zeroes": true, 00:11:15.978 "flush": true, 00:11:15.978 "reset": true, 00:11:15.978 "compare": false, 00:11:15.978 "compare_and_write": false, 00:11:15.978 "abort": true, 00:11:15.978 "nvme_admin": false, 00:11:15.978 "nvme_io": false 00:11:15.978 }, 00:11:15.978 "memory_domains": [ 00:11:15.978 { 00:11:15.978 "dma_device_id": "system", 00:11:15.978 "dma_device_type": 1 00:11:15.978 }, 00:11:15.978 { 00:11:15.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.978 "dma_device_type": 2 00:11:15.978 } 00:11:15.978 ], 00:11:15.978 "driver_specific": {} 00:11:15.978 }' 00:11:15.978 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:15.978 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:15.978 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:15.978 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:16.236 
23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:16.236 23:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:16.494 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:16.494 "name": "BaseBdev2", 00:11:16.494 "aliases": [ 00:11:16.494 "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70" 00:11:16.494 ], 00:11:16.494 "product_name": "Malloc disk", 00:11:16.494 "block_size": 512, 00:11:16.494 "num_blocks": 65536, 00:11:16.494 "uuid": "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70", 00:11:16.494 "assigned_rate_limits": { 00:11:16.494 "rw_ios_per_sec": 0, 00:11:16.494 "rw_mbytes_per_sec": 0, 00:11:16.494 "r_mbytes_per_sec": 0, 00:11:16.494 "w_mbytes_per_sec": 0 00:11:16.494 }, 00:11:16.494 "claimed": true, 00:11:16.494 "claim_type": "exclusive_write", 00:11:16.494 "zoned": false, 00:11:16.494 "supported_io_types": { 00:11:16.494 "read": true, 00:11:16.494 "write": true, 00:11:16.494 "unmap": true, 00:11:16.494 "write_zeroes": true, 00:11:16.494 "flush": true, 00:11:16.494 "reset": true, 00:11:16.494 "compare": false, 00:11:16.494 "compare_and_write": false, 00:11:16.494 "abort": true, 00:11:16.494 "nvme_admin": false, 00:11:16.494 "nvme_io": false 00:11:16.494 }, 00:11:16.494 "memory_domains": [ 00:11:16.494 { 00:11:16.494 "dma_device_id": "system", 00:11:16.494 "dma_device_type": 1 00:11:16.494 }, 00:11:16.494 { 00:11:16.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.494 "dma_device_type": 2 00:11:16.494 } 00:11:16.494 ], 00:11:16.494 "driver_specific": {} 00:11:16.494 }' 00:11:16.494 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:16.494 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:16.752 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:16.753 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:16.753 
23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:16.753 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:17.011 [2024-05-14 23:53:17.514432] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:17.011 [2024-05-14 23:53:17.514461] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:17.011 [2024-05-14 23:53:17.514504] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.011 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.270 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:17.270 "name": "Existed_Raid", 00:11:17.270 "uuid": "2da8e652-cca3-47cd-8e57-f3e3998f889b", 00:11:17.270 "strip_size_kb": 64, 00:11:17.270 "state": "offline", 00:11:17.270 "raid_level": "concat", 00:11:17.270 "superblock": true, 00:11:17.270 "num_base_bdevs": 2, 00:11:17.270 "num_base_bdevs_discovered": 1, 00:11:17.270 "num_base_bdevs_operational": 1, 00:11:17.270 "base_bdevs_list": [ 00:11:17.270 { 00:11:17.270 "name": null, 00:11:17.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.270 
"is_configured": false, 00:11:17.270 "data_offset": 2048, 00:11:17.270 "data_size": 63488 00:11:17.270 }, 00:11:17.270 { 00:11:17.270 "name": "BaseBdev2", 00:11:17.270 "uuid": "5fa5d1c3-4489-4254-81c5-c9c43c8fbd70", 00:11:17.270 "is_configured": true, 00:11:17.270 "data_offset": 2048, 00:11:17.270 "data_size": 63488 00:11:17.270 } 00:11:17.270 ] 00:11:17.270 }' 00:11:17.270 23:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:17.270 23:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:17.836 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:17.836 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:17.836 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.836 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:18.094 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:18.094 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:18.094 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:18.353 [2024-05-14 23:53:18.847059] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:18.353 [2024-05-14 23:53:18.847115] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19204b0 name Existed_Raid, state offline 00:11:18.353 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:18.353 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:18.353 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.353 23:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 391797 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 391797 ']' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 391797 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 391797 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 391797' 00:11:18.611 killing process with pid 391797 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 391797 00:11:18.611 [2024-05-14 23:53:19.161034] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:18.611 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 391797 00:11:18.611 [2024-05-14 23:53:19.162021] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:18.870 23:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:11:18.870 00:11:18.870 real 0m9.883s 00:11:18.870 user 0m17.494s 00:11:18.870 sys 0m1.866s 00:11:18.870 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:18.870 23:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:18.870 ************************************ 00:11:18.870 END TEST raid_state_function_test_sb 00:11:18.870 ************************************ 00:11:18.870 23:53:19 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:18.870 23:53:19 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:11:18.870 23:53:19 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:18.870 23:53:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:19.129 ************************************ 00:11:19.129 START TEST raid_superblock_test 00:11:19.129 ************************************ 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 2 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 
64' 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=393341 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 393341 /var/tmp/spdk-raid.sock 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 393341 ']' 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:19.129 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:19.129 23:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.129 [2024-05-14 23:53:19.563622] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:11:19.129 [2024-05-14 23:53:19.563689] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid393341 ] 00:11:19.129 [2024-05-14 23:53:19.694011] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:19.388 [2024-05-14 23:53:19.797864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:19.388 [2024-05-14 23:53:19.870385] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:19.388 [2024-05-14 23:53:19.870434] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:19.954 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:20.213 malloc1 00:11:20.213 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:20.471 [2024-05-14 23:53:20.841361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:20.471 [2024-05-14 23:53:20.841414] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:20.471 [2024-05-14 23:53:20.841438] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268a780 00:11:20.471 [2024-05-14 23:53:20.841451] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:20.471 [2024-05-14 23:53:20.843110] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:20.471 [2024-05-14 23:53:20.843145] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:20.471 pt1 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:20.471 23:53:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:20.730 malloc2 00:11:20.730 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:20.988 [2024-05-14 23:53:21.323502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:20.988 [2024-05-14 23:53:21.323552] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:20.988 [2024-05-14 23:53:21.323572] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268bb60 00:11:20.988 [2024-05-14 23:53:21.323585] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:20.988 [2024-05-14 23:53:21.324978] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:20.988 [2024-05-14 23:53:21.325006] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:20.988 pt2 00:11:20.988 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:20.988 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:20.988 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:20.988 [2024-05-14 23:53:21.560150] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:11:20.988 [2024-05-14 23:53:21.561315] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:20.988 [2024-05-14 23:53:21.561462] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x28371f0 00:11:20.988 [2024-05-14 23:53:21.561477] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:20.988 [2024-05-14 23:53:21.561651] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a1670 00:11:20.988 [2024-05-14 23:53:21.561787] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28371f0 00:11:20.988 [2024-05-14 23:53:21.561797] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28371f0 00:11:20.988 [2024-05-14 23:53:21.561887] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:21.270 "name": "raid_bdev1", 00:11:21.270 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:21.270 "strip_size_kb": 64, 00:11:21.270 "state": "online", 00:11:21.270 "raid_level": "concat", 00:11:21.270 "superblock": true, 00:11:21.270 "num_base_bdevs": 2, 00:11:21.270 "num_base_bdevs_discovered": 2, 00:11:21.270 "num_base_bdevs_operational": 2, 00:11:21.270 "base_bdevs_list": [ 00:11:21.270 { 00:11:21.270 "name": "pt1", 00:11:21.270 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:21.270 "is_configured": true, 00:11:21.270 "data_offset": 2048, 00:11:21.270 "data_size": 63488 00:11:21.270 }, 00:11:21.270 { 00:11:21.270 "name": "pt2", 00:11:21.270 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:21.270 "is_configured": true, 00:11:21.270 "data_offset": 2048, 00:11:21.270 "data_size": 63488 00:11:21.270 } 00:11:21.270 ] 00:11:21.270 }' 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:21.270 23:53:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:21.847 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:22.106 [2024-05-14 23:53:22.575213] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:22.106 "name": "raid_bdev1", 00:11:22.106 "aliases": [ 00:11:22.106 "6a18bfba-75dc-4c5c-8391-fd99ec734e22" 00:11:22.106 ], 00:11:22.106 "product_name": "Raid Volume", 00:11:22.106 "block_size": 512, 00:11:22.106 "num_blocks": 126976, 00:11:22.106 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:22.106 "assigned_rate_limits": { 00:11:22.106 "rw_ios_per_sec": 0, 00:11:22.106 "rw_mbytes_per_sec": 0, 00:11:22.106 "r_mbytes_per_sec": 0, 00:11:22.106 "w_mbytes_per_sec": 0 00:11:22.106 }, 00:11:22.106 "claimed": false, 00:11:22.106 "zoned": false, 00:11:22.106 "supported_io_types": { 00:11:22.106 "read": true, 00:11:22.106 "write": true, 00:11:22.106 "unmap": true, 00:11:22.106 "write_zeroes": true, 00:11:22.106 "flush": true, 00:11:22.106 "reset": true, 00:11:22.106 "compare": false, 00:11:22.106 "compare_and_write": false, 00:11:22.106 "abort": false, 00:11:22.106 "nvme_admin": false, 00:11:22.106 "nvme_io": false 00:11:22.106 }, 00:11:22.106 "memory_domains": [ 00:11:22.106 { 00:11:22.106 "dma_device_id": "system", 00:11:22.106 "dma_device_type": 1 00:11:22.106 }, 00:11:22.106 { 00:11:22.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.106 "dma_device_type": 2 00:11:22.106 }, 00:11:22.106 { 00:11:22.106 "dma_device_id": "system", 00:11:22.106 "dma_device_type": 1 00:11:22.106 }, 00:11:22.106 { 00:11:22.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.106 "dma_device_type": 2 00:11:22.106 } 00:11:22.106 ], 00:11:22.106 "driver_specific": { 00:11:22.106 "raid": { 00:11:22.106 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:22.106 "strip_size_kb": 64, 00:11:22.106 "state": "online", 00:11:22.106 "raid_level": "concat", 00:11:22.106 "superblock": true, 00:11:22.106 "num_base_bdevs": 2, 00:11:22.106 "num_base_bdevs_discovered": 2, 00:11:22.106 "num_base_bdevs_operational": 2, 00:11:22.106 "base_bdevs_list": [ 00:11:22.106 { 00:11:22.106 "name": "pt1", 00:11:22.106 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:22.106 "is_configured": true, 00:11:22.106 "data_offset": 2048, 00:11:22.106 "data_size": 63488 00:11:22.106 }, 00:11:22.106 { 00:11:22.106 "name": "pt2", 00:11:22.106 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:22.106 "is_configured": true, 00:11:22.106 "data_offset": 2048, 00:11:22.106 "data_size": 63488 00:11:22.106 } 00:11:22.106 ] 00:11:22.106 } 00:11:22.106 } 00:11:22.106 }' 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:22.106 pt2' 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:22.106 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:22.365 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:22.365 "name": "pt1", 00:11:22.365 "aliases": [ 00:11:22.365 "b3006eaf-acda-5730-989d-98dd3fba0226" 00:11:22.365 ], 00:11:22.365 "product_name": "passthru", 00:11:22.365 "block_size": 512, 00:11:22.365 "num_blocks": 65536, 00:11:22.365 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:22.365 "assigned_rate_limits": { 00:11:22.365 "rw_ios_per_sec": 0, 00:11:22.365 "rw_mbytes_per_sec": 0, 00:11:22.365 "r_mbytes_per_sec": 0, 00:11:22.365 "w_mbytes_per_sec": 0 00:11:22.365 }, 00:11:22.365 "claimed": true, 00:11:22.365 "claim_type": "exclusive_write", 00:11:22.365 "zoned": false, 00:11:22.365 "supported_io_types": { 00:11:22.365 "read": true, 00:11:22.365 "write": true, 00:11:22.365 "unmap": true, 00:11:22.365 "write_zeroes": true, 00:11:22.365 "flush": true, 00:11:22.365 "reset": true, 00:11:22.365 "compare": false, 00:11:22.365 "compare_and_write": false, 00:11:22.365 "abort": true, 00:11:22.365 "nvme_admin": false, 00:11:22.365 "nvme_io": false 00:11:22.365 }, 00:11:22.365 "memory_domains": [ 00:11:22.365 { 00:11:22.365 "dma_device_id": "system", 00:11:22.365 "dma_device_type": 1 00:11:22.365 }, 00:11:22.365 { 00:11:22.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.365 "dma_device_type": 2 00:11:22.365 } 00:11:22.365 ], 00:11:22.365 "driver_specific": { 00:11:22.365 "passthru": { 00:11:22.365 "name": "pt1", 00:11:22.365 "base_bdev_name": "malloc1" 00:11:22.365 } 00:11:22.365 } 00:11:22.365 }' 00:11:22.365 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:22.365 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:22.623 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:22.623 23:53:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:22.623 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:22.882 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:22.882 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:22.882 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:22.882 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:23.140 "name": "pt2", 00:11:23.140 "aliases": [ 00:11:23.140 "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743" 00:11:23.140 ], 00:11:23.140 "product_name": "passthru", 00:11:23.140 "block_size": 512, 00:11:23.140 "num_blocks": 65536, 00:11:23.140 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:23.140 "assigned_rate_limits": { 00:11:23.140 "rw_ios_per_sec": 0, 00:11:23.140 "rw_mbytes_per_sec": 0, 00:11:23.140 "r_mbytes_per_sec": 0, 00:11:23.140 "w_mbytes_per_sec": 0 00:11:23.140 }, 00:11:23.140 "claimed": true, 00:11:23.140 "claim_type": "exclusive_write", 00:11:23.140 "zoned": false, 00:11:23.140 "supported_io_types": { 00:11:23.140 "read": true, 00:11:23.140 "write": true, 00:11:23.140 "unmap": true, 00:11:23.140 "write_zeroes": true, 00:11:23.140 "flush": true, 00:11:23.140 "reset": true, 00:11:23.140 "compare": false, 00:11:23.140 "compare_and_write": false, 00:11:23.140 "abort": true, 00:11:23.140 "nvme_admin": false, 00:11:23.140 "nvme_io": false 00:11:23.140 }, 00:11:23.140 "memory_domains": [ 00:11:23.140 { 00:11:23.140 "dma_device_id": "system", 00:11:23.140 "dma_device_type": 1 00:11:23.140 }, 00:11:23.140 { 00:11:23.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.140 "dma_device_type": 2 00:11:23.140 } 00:11:23.140 ], 00:11:23.140 "driver_specific": { 00:11:23.140 "passthru": { 00:11:23.140 "name": "pt2", 00:11:23.140 "base_bdev_name": "malloc2" 00:11:23.140 } 00:11:23.140 } 00:11:23.140 }' 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:23.140 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:23.399 23:53:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:11:23.658 [2024-05-14 23:53:24.035100] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:23.658 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=6a18bfba-75dc-4c5c-8391-fd99ec734e22 00:11:23.658 23:53:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # '[' -z 6a18bfba-75dc-4c5c-8391-fd99ec734e22 ']' 00:11:23.658 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:23.917 [2024-05-14 23:53:24.279517] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:23.917 [2024-05-14 23:53:24.279541] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:23.917 [2024-05-14 23:53:24.279598] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:23.917 [2024-05-14 23:53:24.279647] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:23.917 [2024-05-14 23:53:24.279659] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28371f0 name raid_bdev1, state offline 00:11:23.917 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.917 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:11:24.175 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:11:24.175 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:11:24.175 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:24.175 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:24.433 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:24.433 23:53:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:24.433 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:24.433 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:24.691 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.692 
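The teardown traced just above runs in a fixed order and then proves nothing leaked: delete the raid bdev, delete both passthru bdevs, and confirm no bdev with product_name "passthru" is left. A minimal standalone sketch of that same sequence, using only the RPC calls and the jq filter visible in the trace (the rpc and sock shorthand variables are just local names introduced here; the paths are the ones used throughout this run):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$rpc -s $sock bdev_raid_delete raid_bdev1        # drop the raid volume first
$rpc -s $sock bdev_passthru_delete pt1           # then its two passthru bases
$rpc -s $sock bdev_passthru_delete pt2
# assert that no passthru bdev survived the teardown (same jq check as the @451 step)
left=$($rpc -s $sock bdev_get_bdevs | jq -r '[.[] | select(.product_name == "passthru")] | any')
[ "$left" = false ]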
23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:24.692 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:24.950 [2024-05-14 23:53:25.482669] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:24.950 [2024-05-14 23:53:25.484078] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:24.950 [2024-05-14 23:53:25.484140] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:24.950 [2024-05-14 23:53:25.484182] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:24.950 [2024-05-14 23:53:25.484207] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:24.950 [2024-05-14 23:53:25.484216] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2837d90 name raid_bdev1, state configuring 00:11:24.950 request: 00:11:24.950 { 00:11:24.950 "name": "raid_bdev1", 00:11:24.950 "raid_level": "concat", 00:11:24.950 "base_bdevs": [ 00:11:24.950 "malloc1", 00:11:24.950 "malloc2" 00:11:24.950 ], 00:11:24.950 "superblock": false, 00:11:24.950 "strip_size_kb": 64, 00:11:24.950 "method": "bdev_raid_create", 00:11:24.950 "req_id": 1 00:11:24.950 } 00:11:24.950 Got JSON-RPC error response 00:11:24.950 response: 00:11:24.950 { 00:11:24.950 "code": -17, 00:11:24.950 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:24.950 } 00:11:24.950 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:24.950 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:24.950 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:24.950 23:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:24.951 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.951 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:11:25.209 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:11:25.209 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:11:25.209 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:25.468 [2024-05-14 23:53:25.963869] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:25.468 [2024-05-14 23:53:25.963927] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:25.468 [2024-05-14 23:53:25.963953] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268a9b0 00:11:25.468 [2024-05-14 23:53:25.963966] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:25.468 [2024-05-14 23:53:25.965687] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:25.468 [2024-05-14 23:53:25.965717] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:25.468 [2024-05-14 23:53:25.965797] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:11:25.468 [2024-05-14 23:53:25.965825] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:25.468 pt1 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.468 23:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:25.727 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:25.727 "name": "raid_bdev1", 00:11:25.727 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:25.727 "strip_size_kb": 64, 00:11:25.727 "state": "configuring", 00:11:25.727 "raid_level": "concat", 00:11:25.727 "superblock": true, 00:11:25.727 "num_base_bdevs": 2, 00:11:25.727 "num_base_bdevs_discovered": 1, 00:11:25.727 "num_base_bdevs_operational": 2, 00:11:25.727 "base_bdevs_list": [ 00:11:25.727 { 00:11:25.727 "name": "pt1", 00:11:25.727 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:25.727 "is_configured": true, 00:11:25.727 "data_offset": 2048, 00:11:25.727 "data_size": 63488 00:11:25.727 }, 00:11:25.727 { 00:11:25.727 "name": null, 00:11:25.727 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:25.727 "is_configured": false, 00:11:25.727 "data_offset": 2048, 00:11:25.727 "data_size": 63488 00:11:25.727 } 00:11:25.727 ] 00:11:25.727 }' 00:11:25.727 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:25.727 23:53:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
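The -17 "File exists" error above is the expected outcome: the malloc disks still carry raid_bdev1's superblock, so a fresh bdev_raid_create over them is rejected. Re-registering a passthru on malloc1 instead lets the examine path find that superblock and pull the array back into "configuring" with one of two bases present. A condensed sketch of that re-registration and state check, built from the same RPCs, UUID and jq fields as the trace (rpc, sock and info are just local shorthand introduced here):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[ "$(echo "$info" | jq -r .state)" = configuring ]             # back to configuring
[ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" = 1 ]   # pt1 found, pt2 still missing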
00:11:26.293 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:11:26.293 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:11:26.293 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:26.293 23:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:26.551 [2024-05-14 23:53:27.046812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:26.551 [2024-05-14 23:53:27.046866] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.551 [2024-05-14 23:53:27.046888] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x283b7a0 00:11:26.551 [2024-05-14 23:53:27.046901] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.551 [2024-05-14 23:53:27.047263] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.551 [2024-05-14 23:53:27.047280] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:26.551 [2024-05-14 23:53:27.047347] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:11:26.551 [2024-05-14 23:53:27.047366] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:26.551 [2024-05-14 23:53:27.047476] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x283b9f0 00:11:26.551 [2024-05-14 23:53:27.047487] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:26.551 [2024-05-14 23:53:27.047660] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2689900 00:11:26.551 [2024-05-14 23:53:27.047786] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x283b9f0 00:11:26.551 [2024-05-14 23:53:27.047796] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x283b9f0 00:11:26.551 [2024-05-14 23:53:27.047890] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.551 pt2 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:26.551 23:53:27 
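Registering the second passthru completes the set: the superblock on malloc2 is matched, pt2 is claimed, and raid_bdev1 transitions from "configuring" to "online" (the trace shows the io device being registered with blockcnt 126976, blocklen 512). The same step and check in isolation, reusing only commands already shown in this run:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
# expected output once both bases are claimed: online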
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.551 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:26.810 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:26.810 "name": "raid_bdev1", 00:11:26.810 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:26.810 "strip_size_kb": 64, 00:11:26.810 "state": "online", 00:11:26.810 "raid_level": "concat", 00:11:26.810 "superblock": true, 00:11:26.810 "num_base_bdevs": 2, 00:11:26.810 "num_base_bdevs_discovered": 2, 00:11:26.810 "num_base_bdevs_operational": 2, 00:11:26.810 "base_bdevs_list": [ 00:11:26.810 { 00:11:26.810 "name": "pt1", 00:11:26.810 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:26.810 "is_configured": true, 00:11:26.810 "data_offset": 2048, 00:11:26.810 "data_size": 63488 00:11:26.810 }, 00:11:26.810 { 00:11:26.810 "name": "pt2", 00:11:26.810 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:26.810 "is_configured": true, 00:11:26.810 "data_offset": 2048, 00:11:26.810 "data_size": 63488 00:11:26.810 } 00:11:26.810 ] 00:11:26.810 }' 00:11:26.810 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:26.810 23:53:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:27.376 23:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:27.634 [2024-05-14 23:53:27.985524] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:27.634 "name": "raid_bdev1", 00:11:27.634 "aliases": [ 00:11:27.634 "6a18bfba-75dc-4c5c-8391-fd99ec734e22" 00:11:27.634 ], 00:11:27.634 "product_name": "Raid Volume", 00:11:27.634 "block_size": 512, 00:11:27.634 "num_blocks": 126976, 00:11:27.634 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:27.634 "assigned_rate_limits": { 00:11:27.634 "rw_ios_per_sec": 0, 00:11:27.634 "rw_mbytes_per_sec": 0, 00:11:27.634 "r_mbytes_per_sec": 0, 00:11:27.634 "w_mbytes_per_sec": 0 00:11:27.634 }, 00:11:27.634 "claimed": false, 00:11:27.634 "zoned": false, 00:11:27.634 "supported_io_types": { 00:11:27.634 "read": true, 00:11:27.634 "write": true, 00:11:27.634 "unmap": true, 00:11:27.634 "write_zeroes": true, 00:11:27.634 "flush": true, 00:11:27.634 "reset": true, 00:11:27.634 "compare": false, 00:11:27.634 "compare_and_write": 
false, 00:11:27.634 "abort": false, 00:11:27.634 "nvme_admin": false, 00:11:27.634 "nvme_io": false 00:11:27.634 }, 00:11:27.634 "memory_domains": [ 00:11:27.634 { 00:11:27.634 "dma_device_id": "system", 00:11:27.634 "dma_device_type": 1 00:11:27.634 }, 00:11:27.634 { 00:11:27.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.634 "dma_device_type": 2 00:11:27.634 }, 00:11:27.634 { 00:11:27.634 "dma_device_id": "system", 00:11:27.634 "dma_device_type": 1 00:11:27.634 }, 00:11:27.634 { 00:11:27.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.634 "dma_device_type": 2 00:11:27.634 } 00:11:27.634 ], 00:11:27.634 "driver_specific": { 00:11:27.634 "raid": { 00:11:27.634 "uuid": "6a18bfba-75dc-4c5c-8391-fd99ec734e22", 00:11:27.634 "strip_size_kb": 64, 00:11:27.634 "state": "online", 00:11:27.634 "raid_level": "concat", 00:11:27.634 "superblock": true, 00:11:27.634 "num_base_bdevs": 2, 00:11:27.634 "num_base_bdevs_discovered": 2, 00:11:27.634 "num_base_bdevs_operational": 2, 00:11:27.634 "base_bdevs_list": [ 00:11:27.634 { 00:11:27.634 "name": "pt1", 00:11:27.634 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:27.634 "is_configured": true, 00:11:27.634 "data_offset": 2048, 00:11:27.634 "data_size": 63488 00:11:27.634 }, 00:11:27.634 { 00:11:27.634 "name": "pt2", 00:11:27.634 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:27.634 "is_configured": true, 00:11:27.634 "data_offset": 2048, 00:11:27.634 "data_size": 63488 00:11:27.634 } 00:11:27.634 ] 00:11:27.634 } 00:11:27.634 } 00:11:27.634 }' 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:27.634 pt2' 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:27.634 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:27.891 "name": "pt1", 00:11:27.891 "aliases": [ 00:11:27.891 "b3006eaf-acda-5730-989d-98dd3fba0226" 00:11:27.891 ], 00:11:27.891 "product_name": "passthru", 00:11:27.891 "block_size": 512, 00:11:27.891 "num_blocks": 65536, 00:11:27.891 "uuid": "b3006eaf-acda-5730-989d-98dd3fba0226", 00:11:27.891 "assigned_rate_limits": { 00:11:27.891 "rw_ios_per_sec": 0, 00:11:27.891 "rw_mbytes_per_sec": 0, 00:11:27.891 "r_mbytes_per_sec": 0, 00:11:27.891 "w_mbytes_per_sec": 0 00:11:27.891 }, 00:11:27.891 "claimed": true, 00:11:27.891 "claim_type": "exclusive_write", 00:11:27.891 "zoned": false, 00:11:27.891 "supported_io_types": { 00:11:27.891 "read": true, 00:11:27.891 "write": true, 00:11:27.891 "unmap": true, 00:11:27.891 "write_zeroes": true, 00:11:27.891 "flush": true, 00:11:27.891 "reset": true, 00:11:27.891 "compare": false, 00:11:27.891 "compare_and_write": false, 00:11:27.891 "abort": true, 00:11:27.891 "nvme_admin": false, 00:11:27.891 "nvme_io": false 00:11:27.891 }, 00:11:27.891 "memory_domains": [ 00:11:27.891 { 00:11:27.891 "dma_device_id": "system", 00:11:27.891 "dma_device_type": 1 00:11:27.891 }, 00:11:27.891 { 00:11:27.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.891 "dma_device_type": 2 
00:11:27.891 } 00:11:27.891 ], 00:11:27.891 "driver_specific": { 00:11:27.891 "passthru": { 00:11:27.891 "name": "pt1", 00:11:27.891 "base_bdev_name": "malloc1" 00:11:27.891 } 00:11:27.891 } 00:11:27.891 }' 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.891 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:28.149 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:28.406 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:28.406 "name": "pt2", 00:11:28.406 "aliases": [ 00:11:28.406 "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743" 00:11:28.406 ], 00:11:28.406 "product_name": "passthru", 00:11:28.406 "block_size": 512, 00:11:28.406 "num_blocks": 65536, 00:11:28.406 "uuid": "3dd7f2af-5a83-5b4d-a3f4-f26c1bd79743", 00:11:28.406 "assigned_rate_limits": { 00:11:28.406 "rw_ios_per_sec": 0, 00:11:28.406 "rw_mbytes_per_sec": 0, 00:11:28.406 "r_mbytes_per_sec": 0, 00:11:28.406 "w_mbytes_per_sec": 0 00:11:28.406 }, 00:11:28.406 "claimed": true, 00:11:28.406 "claim_type": "exclusive_write", 00:11:28.406 "zoned": false, 00:11:28.406 "supported_io_types": { 00:11:28.406 "read": true, 00:11:28.406 "write": true, 00:11:28.406 "unmap": true, 00:11:28.406 "write_zeroes": true, 00:11:28.406 "flush": true, 00:11:28.406 "reset": true, 00:11:28.406 "compare": false, 00:11:28.406 "compare_and_write": false, 00:11:28.406 "abort": true, 00:11:28.406 "nvme_admin": false, 00:11:28.406 "nvme_io": false 00:11:28.406 }, 00:11:28.406 "memory_domains": [ 00:11:28.406 { 00:11:28.406 "dma_device_id": "system", 00:11:28.406 "dma_device_type": 1 00:11:28.406 }, 00:11:28.406 { 00:11:28.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.406 "dma_device_type": 2 00:11:28.406 } 00:11:28.406 ], 00:11:28.406 "driver_specific": { 00:11:28.406 "passthru": { 00:11:28.406 "name": "pt2", 00:11:28.406 "base_bdev_name": "malloc2" 00:11:28.406 } 00:11:28.406 } 00:11:28.406 }' 00:11:28.406 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:28.406 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:28.406 23:53:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:28.406 23:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:28.664 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:11:28.921 [2024-05-14 23:53:29.445394] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 6a18bfba-75dc-4c5c-8391-fd99ec734e22 '!=' 6a18bfba-75dc-4c5c-8391-fd99ec734e22 ']' 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 393341 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 393341 ']' 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 393341 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:28.921 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 393341 00:11:29.180 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:29.180 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:29.180 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 393341' 00:11:29.180 killing process with pid 393341 00:11:29.180 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 393341 00:11:29.180 [2024-05-14 23:53:29.516265] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:29.180 [2024-05-14 23:53:29.516344] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:29.180 [2024-05-14 23:53:29.516389] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:29.180 [2024-05-14 23:53:29.516408] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x283b9f0 name raid_bdev1, state offline 00:11:29.180 
23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 393341 00:11:29.180 [2024-05-14 23:53:29.535714] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:29.437 23:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:11:29.437 00:11:29.437 real 0m10.288s 00:11:29.437 user 0m18.295s 00:11:29.437 sys 0m1.935s 00:11:29.437 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:29.437 23:53:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.437 ************************************ 00:11:29.437 END TEST raid_superblock_test 00:11:29.437 ************************************ 00:11:29.437 23:53:29 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:11:29.437 23:53:29 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:29.437 23:53:29 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:29.437 23:53:29 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:29.437 23:53:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:29.437 ************************************ 00:11:29.437 START TEST raid_state_function_test 00:11:29.437 ************************************ 00:11:29.437 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 false 00:11:29.437 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:11:29.437 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:11:29.437 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 
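raid_state_function_test is invoked here as `raid_state_function_test raid1 2 false`, which is why strip_size collapses to 0 and superblock_create_arg stays empty in the local-variable setup above: raid1 takes no stripe-size argument and superblock=false adds nothing. The create call this parameterization produces appears verbatim further down in the trace; isolated, it is simply:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# raid1 with 2 base bdevs and no superblock -> no -z stripe size, no extra superblock flag
$rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid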
00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=394959 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 394959' 00:11:29.438 Process raid pid: 394959 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 394959 /var/tmp/spdk-raid.sock 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 394959 ']' 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:29.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:29.438 23:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.438 [2024-05-14 23:53:29.947490] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
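The test gets its own RPC target by launching the lightweight bdev_svc app against a dedicated socket and waiting for it to come up before issuing any bdev_raid RPCs. A simplified sketch of that startup is below; the waitforlisten helper from autotest_common.sh is approximated here by a plain poll on rpc_get_methods (the real helper does more than this), and raid_pid is just a local name:

app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$app -r $sock -i 0 -L bdev_raid &      # -L bdev_raid enables the DEBUG lines seen in this log
raid_pid=$!
# crude stand-in for waitforlisten: poll until the RPC socket answers
until $rpc -s $sock rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done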
00:11:29.438 [2024-05-14 23:53:29.947562] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:29.695 [2024-05-14 23:53:30.093765] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:29.695 [2024-05-14 23:53:30.196987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.695 [2024-05-14 23:53:30.273643] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.695 [2024-05-14 23:53:30.273678] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:30.260 23:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:30.260 23:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:11:30.260 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:30.518 [2024-05-14 23:53:30.912042] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:30.518 [2024-05-14 23:53:30.912085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:30.518 [2024-05-14 23:53:30.912097] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:30.518 [2024-05-14 23:53:30.912109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.518 23:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.776 23:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:30.776 "name": "Existed_Raid", 00:11:30.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:30.776 "strip_size_kb": 0, 00:11:30.776 "state": "configuring", 00:11:30.776 "raid_level": "raid1", 00:11:30.776 "superblock": false, 00:11:30.776 "num_base_bdevs": 2, 
00:11:30.776 "num_base_bdevs_discovered": 0, 00:11:30.776 "num_base_bdevs_operational": 2, 00:11:30.776 "base_bdevs_list": [ 00:11:30.776 { 00:11:30.776 "name": "BaseBdev1", 00:11:30.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:30.776 "is_configured": false, 00:11:30.776 "data_offset": 0, 00:11:30.776 "data_size": 0 00:11:30.776 }, 00:11:30.776 { 00:11:30.776 "name": "BaseBdev2", 00:11:30.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:30.776 "is_configured": false, 00:11:30.776 "data_offset": 0, 00:11:30.776 "data_size": 0 00:11:30.776 } 00:11:30.776 ] 00:11:30.776 }' 00:11:30.776 23:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:30.776 23:53:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.343 23:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:31.601 [2024-05-14 23:53:31.934628] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:31.601 [2024-05-14 23:53:31.934660] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7cbc0 name Existed_Raid, state configuring 00:11:31.601 23:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:31.601 [2024-05-14 23:53:32.099066] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:31.601 [2024-05-14 23:53:32.099094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:31.601 [2024-05-14 23:53:32.099104] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:31.601 [2024-05-14 23:53:32.099116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:31.601 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:31.859 [2024-05-14 23:53:32.277514] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:31.859 BaseBdev1 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:31.859 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:32.118 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:32.118 [ 00:11:32.118 { 00:11:32.118 
"name": "BaseBdev1", 00:11:32.118 "aliases": [ 00:11:32.118 "71f36991-181d-4361-90d7-713326b70a93" 00:11:32.118 ], 00:11:32.118 "product_name": "Malloc disk", 00:11:32.118 "block_size": 512, 00:11:32.118 "num_blocks": 65536, 00:11:32.118 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:32.118 "assigned_rate_limits": { 00:11:32.118 "rw_ios_per_sec": 0, 00:11:32.118 "rw_mbytes_per_sec": 0, 00:11:32.118 "r_mbytes_per_sec": 0, 00:11:32.118 "w_mbytes_per_sec": 0 00:11:32.118 }, 00:11:32.118 "claimed": true, 00:11:32.118 "claim_type": "exclusive_write", 00:11:32.118 "zoned": false, 00:11:32.118 "supported_io_types": { 00:11:32.118 "read": true, 00:11:32.118 "write": true, 00:11:32.118 "unmap": true, 00:11:32.118 "write_zeroes": true, 00:11:32.118 "flush": true, 00:11:32.118 "reset": true, 00:11:32.118 "compare": false, 00:11:32.118 "compare_and_write": false, 00:11:32.118 "abort": true, 00:11:32.118 "nvme_admin": false, 00:11:32.118 "nvme_io": false 00:11:32.118 }, 00:11:32.118 "memory_domains": [ 00:11:32.118 { 00:11:32.118 "dma_device_id": "system", 00:11:32.118 "dma_device_type": 1 00:11:32.118 }, 00:11:32.118 { 00:11:32.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.118 "dma_device_type": 2 00:11:32.118 } 00:11:32.118 ], 00:11:32.118 "driver_specific": {} 00:11:32.118 } 00:11:32.118 ] 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.376 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:32.376 "name": "Existed_Raid", 00:11:32.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.376 "strip_size_kb": 0, 00:11:32.376 "state": "configuring", 00:11:32.376 "raid_level": "raid1", 00:11:32.376 "superblock": false, 00:11:32.376 "num_base_bdevs": 2, 00:11:32.376 "num_base_bdevs_discovered": 1, 00:11:32.376 "num_base_bdevs_operational": 2, 00:11:32.376 "base_bdevs_list": [ 00:11:32.376 { 00:11:32.376 "name": "BaseBdev1", 00:11:32.376 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:32.376 "is_configured": 
true, 00:11:32.377 "data_offset": 0, 00:11:32.377 "data_size": 65536 00:11:32.377 }, 00:11:32.377 { 00:11:32.377 "name": "BaseBdev2", 00:11:32.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.377 "is_configured": false, 00:11:32.377 "data_offset": 0, 00:11:32.377 "data_size": 0 00:11:32.377 } 00:11:32.377 ] 00:11:32.377 }' 00:11:32.377 23:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:32.377 23:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.945 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:33.205 [2024-05-14 23:53:33.661153] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:33.205 [2024-05-14 23:53:33.661197] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7ce60 name Existed_Raid, state configuring 00:11:33.205 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:33.462 [2024-05-14 23:53:33.905820] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:33.462 [2024-05-14 23:53:33.907324] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:33.462 [2024-05-14 23:53:33.907359] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.462 23:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.720 23:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:33.720 "name": "Existed_Raid", 00:11:33.720 "uuid": "00000000-0000-0000-0000-000000000000", 
00:11:33.720 "strip_size_kb": 0, 00:11:33.720 "state": "configuring", 00:11:33.720 "raid_level": "raid1", 00:11:33.720 "superblock": false, 00:11:33.720 "num_base_bdevs": 2, 00:11:33.720 "num_base_bdevs_discovered": 1, 00:11:33.720 "num_base_bdevs_operational": 2, 00:11:33.720 "base_bdevs_list": [ 00:11:33.720 { 00:11:33.720 "name": "BaseBdev1", 00:11:33.720 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:33.720 "is_configured": true, 00:11:33.720 "data_offset": 0, 00:11:33.720 "data_size": 65536 00:11:33.720 }, 00:11:33.720 { 00:11:33.720 "name": "BaseBdev2", 00:11:33.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.720 "is_configured": false, 00:11:33.720 "data_offset": 0, 00:11:33.720 "data_size": 0 00:11:33.720 } 00:11:33.720 ] 00:11:33.720 }' 00:11:33.720 23:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:33.720 23:53:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.286 23:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:34.543 [2024-05-14 23:53:35.004318] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:34.543 [2024-05-14 23:53:35.004360] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a7c4b0 00:11:34.543 [2024-05-14 23:53:35.004368] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:34.543 [2024-05-14 23:53:35.004577] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7ca70 00:11:34.543 [2024-05-14 23:53:35.004703] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a7c4b0 00:11:34.543 [2024-05-14 23:53:35.004713] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a7c4b0 00:11:34.543 [2024-05-14 23:53:35.004881] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:34.543 BaseBdev2 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:34.543 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:34.801 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:35.070 [ 00:11:35.070 { 00:11:35.070 "name": "BaseBdev2", 00:11:35.070 "aliases": [ 00:11:35.070 "ea291f2d-4267-4349-a406-8a14a4cdacad" 00:11:35.070 ], 00:11:35.070 "product_name": "Malloc disk", 00:11:35.070 "block_size": 512, 00:11:35.070 "num_blocks": 65536, 00:11:35.070 "uuid": "ea291f2d-4267-4349-a406-8a14a4cdacad", 00:11:35.070 
"assigned_rate_limits": { 00:11:35.070 "rw_ios_per_sec": 0, 00:11:35.070 "rw_mbytes_per_sec": 0, 00:11:35.070 "r_mbytes_per_sec": 0, 00:11:35.070 "w_mbytes_per_sec": 0 00:11:35.070 }, 00:11:35.070 "claimed": true, 00:11:35.070 "claim_type": "exclusive_write", 00:11:35.070 "zoned": false, 00:11:35.070 "supported_io_types": { 00:11:35.070 "read": true, 00:11:35.070 "write": true, 00:11:35.070 "unmap": true, 00:11:35.070 "write_zeroes": true, 00:11:35.070 "flush": true, 00:11:35.070 "reset": true, 00:11:35.070 "compare": false, 00:11:35.070 "compare_and_write": false, 00:11:35.070 "abort": true, 00:11:35.070 "nvme_admin": false, 00:11:35.070 "nvme_io": false 00:11:35.070 }, 00:11:35.070 "memory_domains": [ 00:11:35.070 { 00:11:35.070 "dma_device_id": "system", 00:11:35.070 "dma_device_type": 1 00:11:35.070 }, 00:11:35.070 { 00:11:35.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.070 "dma_device_type": 2 00:11:35.070 } 00:11:35.070 ], 00:11:35.070 "driver_specific": {} 00:11:35.070 } 00:11:35.070 ] 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.070 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:35.345 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:35.345 "name": "Existed_Raid", 00:11:35.345 "uuid": "e7edfd21-0b1d-4296-816f-f6df563887e2", 00:11:35.345 "strip_size_kb": 0, 00:11:35.345 "state": "online", 00:11:35.345 "raid_level": "raid1", 00:11:35.345 "superblock": false, 00:11:35.345 "num_base_bdevs": 2, 00:11:35.345 "num_base_bdevs_discovered": 2, 00:11:35.345 "num_base_bdevs_operational": 2, 00:11:35.345 "base_bdevs_list": [ 00:11:35.345 { 00:11:35.345 "name": "BaseBdev1", 00:11:35.345 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:35.345 "is_configured": true, 00:11:35.345 "data_offset": 0, 00:11:35.345 "data_size": 65536 00:11:35.345 }, 00:11:35.345 { 
00:11:35.345 "name": "BaseBdev2", 00:11:35.345 "uuid": "ea291f2d-4267-4349-a406-8a14a4cdacad", 00:11:35.345 "is_configured": true, 00:11:35.345 "data_offset": 0, 00:11:35.345 "data_size": 65536 00:11:35.345 } 00:11:35.345 ] 00:11:35.345 }' 00:11:35.345 23:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:35.345 23:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:35.910 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:36.168 [2024-05-14 23:53:36.552696] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:36.168 "name": "Existed_Raid", 00:11:36.168 "aliases": [ 00:11:36.168 "e7edfd21-0b1d-4296-816f-f6df563887e2" 00:11:36.168 ], 00:11:36.168 "product_name": "Raid Volume", 00:11:36.168 "block_size": 512, 00:11:36.168 "num_blocks": 65536, 00:11:36.168 "uuid": "e7edfd21-0b1d-4296-816f-f6df563887e2", 00:11:36.168 "assigned_rate_limits": { 00:11:36.168 "rw_ios_per_sec": 0, 00:11:36.168 "rw_mbytes_per_sec": 0, 00:11:36.168 "r_mbytes_per_sec": 0, 00:11:36.168 "w_mbytes_per_sec": 0 00:11:36.168 }, 00:11:36.168 "claimed": false, 00:11:36.168 "zoned": false, 00:11:36.168 "supported_io_types": { 00:11:36.168 "read": true, 00:11:36.168 "write": true, 00:11:36.168 "unmap": false, 00:11:36.168 "write_zeroes": true, 00:11:36.168 "flush": false, 00:11:36.168 "reset": true, 00:11:36.168 "compare": false, 00:11:36.168 "compare_and_write": false, 00:11:36.168 "abort": false, 00:11:36.168 "nvme_admin": false, 00:11:36.168 "nvme_io": false 00:11:36.168 }, 00:11:36.168 "memory_domains": [ 00:11:36.168 { 00:11:36.168 "dma_device_id": "system", 00:11:36.168 "dma_device_type": 1 00:11:36.168 }, 00:11:36.168 { 00:11:36.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.168 "dma_device_type": 2 00:11:36.168 }, 00:11:36.168 { 00:11:36.168 "dma_device_id": "system", 00:11:36.168 "dma_device_type": 1 00:11:36.168 }, 00:11:36.168 { 00:11:36.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.168 "dma_device_type": 2 00:11:36.168 } 00:11:36.168 ], 00:11:36.168 "driver_specific": { 00:11:36.168 "raid": { 00:11:36.168 "uuid": "e7edfd21-0b1d-4296-816f-f6df563887e2", 00:11:36.168 "strip_size_kb": 0, 00:11:36.168 "state": "online", 00:11:36.168 "raid_level": "raid1", 00:11:36.168 "superblock": false, 00:11:36.168 "num_base_bdevs": 2, 00:11:36.168 "num_base_bdevs_discovered": 2, 00:11:36.168 "num_base_bdevs_operational": 2, 00:11:36.168 "base_bdevs_list": [ 00:11:36.168 { 00:11:36.168 "name": 
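Once both malloc base bdevs exist, Existed_Raid reports state "online" with 2 of 2 bases discovered, and its Raid Volume descriptor above advertises a narrower supported_io_types set than the Malloc disks underneath it (unmap, flush and abort all read false for the raid1 volume but true for each BaseBdev). Two quick jq probes over the same RPC output surface both facts; they reuse only calls already shown in this trace, with rpc and sock as local shorthand:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# state and discovery count of the raid bdev
$rpc -s $sock bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
# expected: online 2/2
# io types: the raid1 volume next to its Malloc bases
$rpc -s $sock bdev_get_bdevs \
  | jq '.[] | {name, unmap: .supported_io_types.unmap, flush: .supported_io_types.flush, abort: .supported_io_types.abort}'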
"BaseBdev1", 00:11:36.168 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:36.168 "is_configured": true, 00:11:36.168 "data_offset": 0, 00:11:36.168 "data_size": 65536 00:11:36.168 }, 00:11:36.168 { 00:11:36.168 "name": "BaseBdev2", 00:11:36.168 "uuid": "ea291f2d-4267-4349-a406-8a14a4cdacad", 00:11:36.168 "is_configured": true, 00:11:36.168 "data_offset": 0, 00:11:36.168 "data_size": 65536 00:11:36.168 } 00:11:36.168 ] 00:11:36.168 } 00:11:36.168 } 00:11:36.168 }' 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:36.168 BaseBdev2' 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:36.168 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:36.427 "name": "BaseBdev1", 00:11:36.427 "aliases": [ 00:11:36.427 "71f36991-181d-4361-90d7-713326b70a93" 00:11:36.427 ], 00:11:36.427 "product_name": "Malloc disk", 00:11:36.427 "block_size": 512, 00:11:36.427 "num_blocks": 65536, 00:11:36.427 "uuid": "71f36991-181d-4361-90d7-713326b70a93", 00:11:36.427 "assigned_rate_limits": { 00:11:36.427 "rw_ios_per_sec": 0, 00:11:36.427 "rw_mbytes_per_sec": 0, 00:11:36.427 "r_mbytes_per_sec": 0, 00:11:36.427 "w_mbytes_per_sec": 0 00:11:36.427 }, 00:11:36.427 "claimed": true, 00:11:36.427 "claim_type": "exclusive_write", 00:11:36.427 "zoned": false, 00:11:36.427 "supported_io_types": { 00:11:36.427 "read": true, 00:11:36.427 "write": true, 00:11:36.427 "unmap": true, 00:11:36.427 "write_zeroes": true, 00:11:36.427 "flush": true, 00:11:36.427 "reset": true, 00:11:36.427 "compare": false, 00:11:36.427 "compare_and_write": false, 00:11:36.427 "abort": true, 00:11:36.427 "nvme_admin": false, 00:11:36.427 "nvme_io": false 00:11:36.427 }, 00:11:36.427 "memory_domains": [ 00:11:36.427 { 00:11:36.427 "dma_device_id": "system", 00:11:36.427 "dma_device_type": 1 00:11:36.427 }, 00:11:36.427 { 00:11:36.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.427 "dma_device_type": 2 00:11:36.427 } 00:11:36.427 ], 00:11:36.427 "driver_specific": {} 00:11:36.427 }' 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:36.427 23:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:36.685 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:36.943 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:36.943 "name": "BaseBdev2", 00:11:36.943 "aliases": [ 00:11:36.943 "ea291f2d-4267-4349-a406-8a14a4cdacad" 00:11:36.943 ], 00:11:36.943 "product_name": "Malloc disk", 00:11:36.943 "block_size": 512, 00:11:36.943 "num_blocks": 65536, 00:11:36.943 "uuid": "ea291f2d-4267-4349-a406-8a14a4cdacad", 00:11:36.943 "assigned_rate_limits": { 00:11:36.943 "rw_ios_per_sec": 0, 00:11:36.943 "rw_mbytes_per_sec": 0, 00:11:36.943 "r_mbytes_per_sec": 0, 00:11:36.943 "w_mbytes_per_sec": 0 00:11:36.943 }, 00:11:36.943 "claimed": true, 00:11:36.943 "claim_type": "exclusive_write", 00:11:36.943 "zoned": false, 00:11:36.943 "supported_io_types": { 00:11:36.943 "read": true, 00:11:36.943 "write": true, 00:11:36.943 "unmap": true, 00:11:36.943 "write_zeroes": true, 00:11:36.943 "flush": true, 00:11:36.943 "reset": true, 00:11:36.943 "compare": false, 00:11:36.943 "compare_and_write": false, 00:11:36.943 "abort": true, 00:11:36.943 "nvme_admin": false, 00:11:36.943 "nvme_io": false 00:11:36.943 }, 00:11:36.943 "memory_domains": [ 00:11:36.943 { 00:11:36.943 "dma_device_id": "system", 00:11:36.943 "dma_device_type": 1 00:11:36.943 }, 00:11:36.943 { 00:11:36.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.943 "dma_device_type": 2 00:11:36.943 } 00:11:36.943 ], 00:11:36.943 "driver_specific": {} 00:11:36.943 }' 00:11:36.943 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:36.943 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:36.943 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:36.943 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:37.201 23:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:11:37.459 [2024-05-14 23:53:38.004538] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.459 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.717 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:37.717 "name": "Existed_Raid", 00:11:37.717 "uuid": "e7edfd21-0b1d-4296-816f-f6df563887e2", 00:11:37.717 "strip_size_kb": 0, 00:11:37.717 "state": "online", 00:11:37.717 "raid_level": "raid1", 00:11:37.717 "superblock": false, 00:11:37.717 "num_base_bdevs": 2, 00:11:37.717 "num_base_bdevs_discovered": 1, 00:11:37.717 "num_base_bdevs_operational": 1, 00:11:37.717 "base_bdevs_list": [ 00:11:37.717 { 00:11:37.717 "name": null, 00:11:37.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.717 "is_configured": false, 00:11:37.717 "data_offset": 0, 00:11:37.717 "data_size": 65536 00:11:37.717 }, 00:11:37.717 { 00:11:37.717 "name": "BaseBdev2", 00:11:37.717 "uuid": "ea291f2d-4267-4349-a406-8a14a4cdacad", 00:11:37.717 "is_configured": true, 00:11:37.717 "data_offset": 0, 00:11:37.717 "data_size": 65536 00:11:37.717 } 00:11:37.717 ] 00:11:37.717 }' 00:11:37.717 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:37.717 23:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.283 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:38.283 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:38.283 23:53:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.283 23:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:38.541 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:38.541 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:38.541 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:38.892 [2024-05-14 23:53:39.305013] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:38.892 [2024-05-14 23:53:39.305090] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:38.892 [2024-05-14 23:53:39.315950] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:38.892 [2024-05-14 23:53:39.316017] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:38.892 [2024-05-14 23:53:39.316031] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a7c4b0 name Existed_Raid, state offline 00:11:38.892 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:38.892 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:38.892 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.892 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 394959 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 394959 ']' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 394959 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 394959 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 394959' 00:11:39.152 killing process with pid 394959 00:11:39.152 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 394959 00:11:39.152 [2024-05-14 23:53:39.628815] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:39.152 23:53:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 394959 00:11:39.152 [2024-05-14 23:53:39.629720] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:11:39.411 00:11:39.411 real 0m9.995s 00:11:39.411 user 0m17.668s 00:11:39.411 sys 0m1.895s 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.411 ************************************ 00:11:39.411 END TEST raid_state_function_test 00:11:39.411 ************************************ 00:11:39.411 23:53:39 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:39.411 23:53:39 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:39.411 23:53:39 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:39.411 23:53:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:39.411 ************************************ 00:11:39.411 START TEST raid_state_function_test_sb 00:11:39.411 ************************************ 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:11:39.411 23:53:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=396431 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 396431' 00:11:39.411 Process raid pid: 396431 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 396431 /var/tmp/spdk-raid.sock 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 396431 ']' 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:39.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:39.411 23:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:39.670 [2024-05-14 23:53:40.037981] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
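[editor's note] The entries above start the superblock flavor of the same test: raid_state_function_test is re-run as raid_state_function_test_sb with identical raid1 / two-base-bdev parameters, but bdev_raid_create is invoked with -s, so each base bdev reserves room for an on-disk superblock (visible later in this log as data_offset 2048 / data_size 63488 instead of 0 / 65536). A condensed sketch of the create-and-inspect flow the test drives over its dedicated socket, using only commands that appear in this run; the real script interleaves these steps to exercise the configuring-to-online state transitions:

  # Start the minimal bdev app on the raid test socket with bdev_raid debug logging,
  # then build a two-way raid1 with superblocks and read back its state.
  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  sock=/var/tmp/spdk-raid.sock
  $spdk/test/app/bdev_svc/bdev_svc -r $sock -i 0 -L bdev_raid &
  # (the test waits for the RPC socket to come up before issuing commands)
  $spdk/scripts/rpc.py -s $sock bdev_malloc_create 32 512 -b BaseBdev1
  $spdk/scripts/rpc.py -s $sock bdev_malloc_create 32 512 -b BaseBdev2
  $spdk/scripts/rpc.py -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  $spdk/scripts/rpc.py -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'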
00:11:39.670 [2024-05-14 23:53:40.038059] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:39.670 [2024-05-14 23:53:40.167391] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:39.929 [2024-05-14 23:53:40.274712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:39.929 [2024-05-14 23:53:40.346689] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:39.929 [2024-05-14 23:53:40.346728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.496 23:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:40.496 23:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:11:40.496 23:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:40.755 [2024-05-14 23:53:41.186238] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:40.755 [2024-05-14 23:53:41.186278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:40.755 [2024-05-14 23:53:41.186291] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:40.755 [2024-05-14 23:53:41.186303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.755 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.014 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:41.014 "name": "Existed_Raid", 00:11:41.014 "uuid": "4f41ab3d-5970-407c-884c-3b8848170404", 00:11:41.014 "strip_size_kb": 0, 00:11:41.014 "state": "configuring", 00:11:41.014 "raid_level": "raid1", 00:11:41.014 "superblock": 
true, 00:11:41.014 "num_base_bdevs": 2, 00:11:41.014 "num_base_bdevs_discovered": 0, 00:11:41.014 "num_base_bdevs_operational": 2, 00:11:41.014 "base_bdevs_list": [ 00:11:41.014 { 00:11:41.014 "name": "BaseBdev1", 00:11:41.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.014 "is_configured": false, 00:11:41.014 "data_offset": 0, 00:11:41.014 "data_size": 0 00:11:41.014 }, 00:11:41.014 { 00:11:41.014 "name": "BaseBdev2", 00:11:41.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.014 "is_configured": false, 00:11:41.014 "data_offset": 0, 00:11:41.014 "data_size": 0 00:11:41.014 } 00:11:41.014 ] 00:11:41.014 }' 00:11:41.014 23:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:41.014 23:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:41.582 23:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:41.840 [2024-05-14 23:53:42.260955] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:41.840 [2024-05-14 23:53:42.260986] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d1bc0 name Existed_Raid, state configuring 00:11:41.840 23:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:42.098 [2024-05-14 23:53:42.505619] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:42.098 [2024-05-14 23:53:42.505651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:42.098 [2024-05-14 23:53:42.505662] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:42.098 [2024-05-14 23:53:42.505673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:42.098 23:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:42.357 [2024-05-14 23:53:42.760205] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.357 BaseBdev1 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:42.357 23:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:42.616 23:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:42.616 [ 00:11:42.616 { 00:11:42.616 "name": "BaseBdev1", 00:11:42.616 "aliases": [ 00:11:42.616 "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f" 00:11:42.616 ], 00:11:42.616 "product_name": "Malloc disk", 00:11:42.616 "block_size": 512, 00:11:42.616 "num_blocks": 65536, 00:11:42.616 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:42.616 "assigned_rate_limits": { 00:11:42.616 "rw_ios_per_sec": 0, 00:11:42.616 "rw_mbytes_per_sec": 0, 00:11:42.616 "r_mbytes_per_sec": 0, 00:11:42.616 "w_mbytes_per_sec": 0 00:11:42.616 }, 00:11:42.616 "claimed": true, 00:11:42.616 "claim_type": "exclusive_write", 00:11:42.616 "zoned": false, 00:11:42.616 "supported_io_types": { 00:11:42.616 "read": true, 00:11:42.616 "write": true, 00:11:42.616 "unmap": true, 00:11:42.616 "write_zeroes": true, 00:11:42.616 "flush": true, 00:11:42.616 "reset": true, 00:11:42.616 "compare": false, 00:11:42.616 "compare_and_write": false, 00:11:42.616 "abort": true, 00:11:42.616 "nvme_admin": false, 00:11:42.616 "nvme_io": false 00:11:42.616 }, 00:11:42.616 "memory_domains": [ 00:11:42.616 { 00:11:42.616 "dma_device_id": "system", 00:11:42.616 "dma_device_type": 1 00:11:42.616 }, 00:11:42.616 { 00:11:42.616 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.616 "dma_device_type": 2 00:11:42.616 } 00:11:42.616 ], 00:11:42.616 "driver_specific": {} 00:11:42.616 } 00:11:42.616 ] 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:42.875 "name": "Existed_Raid", 00:11:42.875 "uuid": "df60ddd3-1d19-4eb9-acd6-8c5c627abbe9", 00:11:42.875 "strip_size_kb": 0, 00:11:42.875 "state": "configuring", 00:11:42.875 "raid_level": "raid1", 00:11:42.875 "superblock": true, 00:11:42.875 "num_base_bdevs": 2, 00:11:42.875 "num_base_bdevs_discovered": 1, 00:11:42.875 "num_base_bdevs_operational": 2, 00:11:42.875 "base_bdevs_list": [ 00:11:42.875 { 
00:11:42.875 "name": "BaseBdev1", 00:11:42.875 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:42.875 "is_configured": true, 00:11:42.875 "data_offset": 2048, 00:11:42.875 "data_size": 63488 00:11:42.875 }, 00:11:42.875 { 00:11:42.875 "name": "BaseBdev2", 00:11:42.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.875 "is_configured": false, 00:11:42.875 "data_offset": 0, 00:11:42.875 "data_size": 0 00:11:42.875 } 00:11:42.875 ] 00:11:42.875 }' 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:42.875 23:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:43.441 23:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:43.698 [2024-05-14 23:53:44.143831] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:43.698 [2024-05-14 23:53:44.143870] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d1e60 name Existed_Raid, state configuring 00:11:43.698 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:43.956 [2024-05-14 23:53:44.312313] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:43.956 [2024-05-14 23:53:44.313805] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:43.956 [2024-05-14 23:53:44.313836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.956 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.214 
23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:44.214 "name": "Existed_Raid", 00:11:44.214 "uuid": "0b8818f3-7b4f-4749-af66-1dbcd692052f", 00:11:44.214 "strip_size_kb": 0, 00:11:44.214 "state": "configuring", 00:11:44.214 "raid_level": "raid1", 00:11:44.214 "superblock": true, 00:11:44.214 "num_base_bdevs": 2, 00:11:44.214 "num_base_bdevs_discovered": 1, 00:11:44.214 "num_base_bdevs_operational": 2, 00:11:44.214 "base_bdevs_list": [ 00:11:44.214 { 00:11:44.214 "name": "BaseBdev1", 00:11:44.214 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:44.214 "is_configured": true, 00:11:44.214 "data_offset": 2048, 00:11:44.214 "data_size": 63488 00:11:44.214 }, 00:11:44.214 { 00:11:44.214 "name": "BaseBdev2", 00:11:44.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.214 "is_configured": false, 00:11:44.214 "data_offset": 0, 00:11:44.214 "data_size": 0 00:11:44.214 } 00:11:44.214 ] 00:11:44.214 }' 00:11:44.214 23:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:44.214 23:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.780 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:45.039 [2024-05-14 23:53:45.410605] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:45.039 [2024-05-14 23:53:45.410749] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x17d14b0 00:11:45.039 [2024-05-14 23:53:45.410764] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:45.039 [2024-05-14 23:53:45.410935] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d1a70 00:11:45.039 [2024-05-14 23:53:45.411055] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17d14b0 00:11:45.039 [2024-05-14 23:53:45.411065] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17d14b0 00:11:45.039 [2024-05-14 23:53:45.411156] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.039 BaseBdev2 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:45.039 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:45.298 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:45.557 [ 00:11:45.557 { 00:11:45.557 "name": "BaseBdev2", 00:11:45.557 "aliases": [ 00:11:45.557 
"de8d84a9-eb35-4a25-b668-f6a99b1ee4b5" 00:11:45.557 ], 00:11:45.557 "product_name": "Malloc disk", 00:11:45.557 "block_size": 512, 00:11:45.557 "num_blocks": 65536, 00:11:45.557 "uuid": "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5", 00:11:45.557 "assigned_rate_limits": { 00:11:45.557 "rw_ios_per_sec": 0, 00:11:45.557 "rw_mbytes_per_sec": 0, 00:11:45.557 "r_mbytes_per_sec": 0, 00:11:45.557 "w_mbytes_per_sec": 0 00:11:45.557 }, 00:11:45.557 "claimed": true, 00:11:45.557 "claim_type": "exclusive_write", 00:11:45.557 "zoned": false, 00:11:45.557 "supported_io_types": { 00:11:45.557 "read": true, 00:11:45.557 "write": true, 00:11:45.557 "unmap": true, 00:11:45.557 "write_zeroes": true, 00:11:45.557 "flush": true, 00:11:45.557 "reset": true, 00:11:45.557 "compare": false, 00:11:45.557 "compare_and_write": false, 00:11:45.557 "abort": true, 00:11:45.557 "nvme_admin": false, 00:11:45.557 "nvme_io": false 00:11:45.557 }, 00:11:45.557 "memory_domains": [ 00:11:45.557 { 00:11:45.557 "dma_device_id": "system", 00:11:45.557 "dma_device_type": 1 00:11:45.557 }, 00:11:45.557 { 00:11:45.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.557 "dma_device_type": 2 00:11:45.557 } 00:11:45.557 ], 00:11:45.557 "driver_specific": {} 00:11:45.557 } 00:11:45.557 ] 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.557 23:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.815 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:45.815 "name": "Existed_Raid", 00:11:45.815 "uuid": "0b8818f3-7b4f-4749-af66-1dbcd692052f", 00:11:45.815 "strip_size_kb": 0, 00:11:45.815 "state": "online", 00:11:45.815 "raid_level": "raid1", 00:11:45.815 "superblock": true, 00:11:45.815 "num_base_bdevs": 2, 00:11:45.815 "num_base_bdevs_discovered": 2, 00:11:45.815 
"num_base_bdevs_operational": 2, 00:11:45.815 "base_bdevs_list": [ 00:11:45.815 { 00:11:45.815 "name": "BaseBdev1", 00:11:45.815 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:45.815 "is_configured": true, 00:11:45.815 "data_offset": 2048, 00:11:45.815 "data_size": 63488 00:11:45.815 }, 00:11:45.815 { 00:11:45.815 "name": "BaseBdev2", 00:11:45.815 "uuid": "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5", 00:11:45.815 "is_configured": true, 00:11:45.815 "data_offset": 2048, 00:11:45.815 "data_size": 63488 00:11:45.815 } 00:11:45.815 ] 00:11:45.815 }' 00:11:45.815 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:45.815 23:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:46.381 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:46.381 [2024-05-14 23:53:46.966973] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:46.639 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:46.639 "name": "Existed_Raid", 00:11:46.639 "aliases": [ 00:11:46.639 "0b8818f3-7b4f-4749-af66-1dbcd692052f" 00:11:46.639 ], 00:11:46.639 "product_name": "Raid Volume", 00:11:46.639 "block_size": 512, 00:11:46.639 "num_blocks": 63488, 00:11:46.639 "uuid": "0b8818f3-7b4f-4749-af66-1dbcd692052f", 00:11:46.639 "assigned_rate_limits": { 00:11:46.639 "rw_ios_per_sec": 0, 00:11:46.639 "rw_mbytes_per_sec": 0, 00:11:46.639 "r_mbytes_per_sec": 0, 00:11:46.639 "w_mbytes_per_sec": 0 00:11:46.639 }, 00:11:46.639 "claimed": false, 00:11:46.639 "zoned": false, 00:11:46.639 "supported_io_types": { 00:11:46.639 "read": true, 00:11:46.639 "write": true, 00:11:46.639 "unmap": false, 00:11:46.639 "write_zeroes": true, 00:11:46.639 "flush": false, 00:11:46.639 "reset": true, 00:11:46.639 "compare": false, 00:11:46.639 "compare_and_write": false, 00:11:46.639 "abort": false, 00:11:46.639 "nvme_admin": false, 00:11:46.639 "nvme_io": false 00:11:46.639 }, 00:11:46.639 "memory_domains": [ 00:11:46.639 { 00:11:46.639 "dma_device_id": "system", 00:11:46.639 "dma_device_type": 1 00:11:46.639 }, 00:11:46.639 { 00:11:46.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.639 "dma_device_type": 2 00:11:46.639 }, 00:11:46.639 { 00:11:46.639 "dma_device_id": "system", 00:11:46.639 "dma_device_type": 1 00:11:46.639 }, 00:11:46.639 { 00:11:46.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.639 "dma_device_type": 2 00:11:46.639 } 00:11:46.639 ], 00:11:46.639 "driver_specific": { 00:11:46.639 "raid": { 00:11:46.639 "uuid": 
"0b8818f3-7b4f-4749-af66-1dbcd692052f", 00:11:46.639 "strip_size_kb": 0, 00:11:46.639 "state": "online", 00:11:46.639 "raid_level": "raid1", 00:11:46.639 "superblock": true, 00:11:46.639 "num_base_bdevs": 2, 00:11:46.639 "num_base_bdevs_discovered": 2, 00:11:46.639 "num_base_bdevs_operational": 2, 00:11:46.639 "base_bdevs_list": [ 00:11:46.639 { 00:11:46.639 "name": "BaseBdev1", 00:11:46.639 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:46.639 "is_configured": true, 00:11:46.639 "data_offset": 2048, 00:11:46.639 "data_size": 63488 00:11:46.639 }, 00:11:46.639 { 00:11:46.639 "name": "BaseBdev2", 00:11:46.639 "uuid": "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5", 00:11:46.639 "is_configured": true, 00:11:46.639 "data_offset": 2048, 00:11:46.639 "data_size": 63488 00:11:46.639 } 00:11:46.639 ] 00:11:46.639 } 00:11:46.639 } 00:11:46.639 }' 00:11:46.639 23:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:46.639 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:46.639 BaseBdev2' 00:11:46.639 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:46.639 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:46.639 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:46.898 "name": "BaseBdev1", 00:11:46.898 "aliases": [ 00:11:46.898 "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f" 00:11:46.898 ], 00:11:46.898 "product_name": "Malloc disk", 00:11:46.898 "block_size": 512, 00:11:46.898 "num_blocks": 65536, 00:11:46.898 "uuid": "c4a0a154-9dd9-41c7-a70d-b1a27fa4bc4f", 00:11:46.898 "assigned_rate_limits": { 00:11:46.898 "rw_ios_per_sec": 0, 00:11:46.898 "rw_mbytes_per_sec": 0, 00:11:46.898 "r_mbytes_per_sec": 0, 00:11:46.898 "w_mbytes_per_sec": 0 00:11:46.898 }, 00:11:46.898 "claimed": true, 00:11:46.898 "claim_type": "exclusive_write", 00:11:46.898 "zoned": false, 00:11:46.898 "supported_io_types": { 00:11:46.898 "read": true, 00:11:46.898 "write": true, 00:11:46.898 "unmap": true, 00:11:46.898 "write_zeroes": true, 00:11:46.898 "flush": true, 00:11:46.898 "reset": true, 00:11:46.898 "compare": false, 00:11:46.898 "compare_and_write": false, 00:11:46.898 "abort": true, 00:11:46.898 "nvme_admin": false, 00:11:46.898 "nvme_io": false 00:11:46.898 }, 00:11:46.898 "memory_domains": [ 00:11:46.898 { 00:11:46.898 "dma_device_id": "system", 00:11:46.898 "dma_device_type": 1 00:11:46.898 }, 00:11:46.898 { 00:11:46.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.898 "dma_device_type": 2 00:11:46.898 } 00:11:46.898 ], 00:11:46.898 "driver_specific": {} 00:11:46.898 }' 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:46.898 
23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:46.898 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:47.156 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:47.414 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:47.414 "name": "BaseBdev2", 00:11:47.414 "aliases": [ 00:11:47.414 "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5" 00:11:47.414 ], 00:11:47.414 "product_name": "Malloc disk", 00:11:47.414 "block_size": 512, 00:11:47.414 "num_blocks": 65536, 00:11:47.414 "uuid": "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5", 00:11:47.414 "assigned_rate_limits": { 00:11:47.414 "rw_ios_per_sec": 0, 00:11:47.414 "rw_mbytes_per_sec": 0, 00:11:47.414 "r_mbytes_per_sec": 0, 00:11:47.414 "w_mbytes_per_sec": 0 00:11:47.414 }, 00:11:47.414 "claimed": true, 00:11:47.414 "claim_type": "exclusive_write", 00:11:47.414 "zoned": false, 00:11:47.414 "supported_io_types": { 00:11:47.414 "read": true, 00:11:47.414 "write": true, 00:11:47.414 "unmap": true, 00:11:47.414 "write_zeroes": true, 00:11:47.414 "flush": true, 00:11:47.414 "reset": true, 00:11:47.414 "compare": false, 00:11:47.414 "compare_and_write": false, 00:11:47.414 "abort": true, 00:11:47.415 "nvme_admin": false, 00:11:47.415 "nvme_io": false 00:11:47.415 }, 00:11:47.415 "memory_domains": [ 00:11:47.415 { 00:11:47.415 "dma_device_id": "system", 00:11:47.415 "dma_device_type": 1 00:11:47.415 }, 00:11:47.415 { 00:11:47.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.415 "dma_device_type": 2 00:11:47.415 } 00:11:47.415 ], 00:11:47.415 "driver_specific": {} 00:11:47.415 }' 00:11:47.415 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:47.415 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:47.415 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:47.415 23:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.672 
23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:47.672 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:47.673 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:47.930 [2024-05-14 23:53:48.410611] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:47.930 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.931 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.189 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:48.189 "name": "Existed_Raid", 00:11:48.189 "uuid": "0b8818f3-7b4f-4749-af66-1dbcd692052f", 00:11:48.189 "strip_size_kb": 0, 00:11:48.189 "state": "online", 00:11:48.189 "raid_level": "raid1", 00:11:48.189 "superblock": true, 00:11:48.189 "num_base_bdevs": 2, 00:11:48.189 "num_base_bdevs_discovered": 1, 00:11:48.189 "num_base_bdevs_operational": 1, 00:11:48.189 "base_bdevs_list": [ 00:11:48.189 { 00:11:48.189 "name": null, 00:11:48.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.189 "is_configured": false, 00:11:48.189 "data_offset": 2048, 00:11:48.189 "data_size": 63488 00:11:48.189 }, 00:11:48.189 { 00:11:48.189 "name": "BaseBdev2", 00:11:48.189 "uuid": "de8d84a9-eb35-4a25-b668-f6a99b1ee4b5", 00:11:48.189 "is_configured": true, 00:11:48.189 
"data_offset": 2048, 00:11:48.189 "data_size": 63488 00:11:48.189 } 00:11:48.189 ] 00:11:48.189 }' 00:11:48.189 23:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:48.189 23:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.755 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:48.755 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:48.755 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.755 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:49.013 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:49.013 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:49.013 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:49.271 [2024-05-14 23:53:49.751228] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:49.271 [2024-05-14 23:53:49.751313] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:49.271 [2024-05-14 23:53:49.764043] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:49.271 [2024-05-14 23:53:49.764111] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:49.271 [2024-05-14 23:53:49.764124] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d14b0 name Existed_Raid, state offline 00:11:49.271 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:49.271 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:49.271 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.271 23:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 396431 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 396431 ']' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 396431 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 396431 00:11:49.529 23:53:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 396431' 00:11:49.529 killing process with pid 396431 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 396431 00:11:49.529 [2024-05-14 23:53:50.087099] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:49.529 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 396431 00:11:49.529 [2024-05-14 23:53:50.087977] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:49.788 23:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:11:49.788 00:11:49.788 real 0m10.345s 00:11:49.788 user 0m18.332s 00:11:49.788 sys 0m1.948s 00:11:49.788 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:49.788 23:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.788 ************************************ 00:11:49.788 END TEST raid_state_function_test_sb 00:11:49.788 ************************************ 00:11:49.788 23:53:50 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:11:49.788 23:53:50 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:11:49.788 23:53:50 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:49.788 23:53:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:50.046 ************************************ 00:11:50.046 START TEST raid_superblock_test 00:11:50.046 ************************************ 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:11:50.046 23:53:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=398060 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 398060 /var/tmp/spdk-raid.sock 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 398060 ']' 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:50.046 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:50.047 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:50.047 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:50.047 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:50.047 23:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.047 [2024-05-14 23:53:50.465371] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:11:50.047 [2024-05-14 23:53:50.465442] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398060 ] 00:11:50.047 [2024-05-14 23:53:50.593431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:50.304 [2024-05-14 23:53:50.700738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:50.304 [2024-05-14 23:53:50.768794] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.304 [2024-05-14 23:53:50.768834] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:50.870 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:51.132 malloc1 00:11:51.132 23:53:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:51.390 [2024-05-14 23:53:51.863340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:51.390 [2024-05-14 23:53:51.863389] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:51.390 [2024-05-14 23:53:51.863421] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd8f780 00:11:51.390 [2024-05-14 23:53:51.863434] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:51.390 [2024-05-14 23:53:51.865166] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:51.390 [2024-05-14 23:53:51.865194] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:51.390 pt1 00:11:51.390 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:51.390 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:51.390 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:51.391 23:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:51.649 malloc2 00:11:51.649 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:51.907 [2024-05-14 23:53:52.358748] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:51.907 [2024-05-14 23:53:52.358796] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:51.907 [2024-05-14 23:53:52.358815] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd90b60 00:11:51.907 [2024-05-14 23:53:52.358827] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:51.907 [2024-05-14 23:53:52.360424] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:51.907 [2024-05-14 23:53:52.360452] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:51.907 pt2 00:11:51.907 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:51.907 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:51.907 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:52.166 [2024-05-14 23:53:52.599412] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:52.166 [2024-05-14 23:53:52.600787] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:52.166 [2024-05-14 23:53:52.600946] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3c1f0 00:11:52.166 [2024-05-14 23:53:52.600960] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:52.166 [2024-05-14 23:53:52.601172] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xda6670 00:11:52.166 [2024-05-14 23:53:52.601326] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3c1f0 00:11:52.166 [2024-05-14 23:53:52.601336] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3c1f0 00:11:52.166 [2024-05-14 23:53:52.601451] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.166 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:52.424 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:52.424 "name": "raid_bdev1", 00:11:52.424 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:52.424 "strip_size_kb": 0, 00:11:52.424 "state": "online", 00:11:52.424 "raid_level": "raid1", 00:11:52.424 "superblock": true, 00:11:52.424 "num_base_bdevs": 2, 00:11:52.424 "num_base_bdevs_discovered": 2, 00:11:52.424 "num_base_bdevs_operational": 2, 00:11:52.424 "base_bdevs_list": [ 00:11:52.424 { 00:11:52.424 "name": "pt1", 00:11:52.424 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:52.424 "is_configured": true, 00:11:52.424 "data_offset": 2048, 00:11:52.424 "data_size": 63488 00:11:52.424 }, 00:11:52.424 { 00:11:52.424 "name": "pt2", 00:11:52.424 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:52.424 "is_configured": true, 00:11:52.424 "data_offset": 2048, 00:11:52.424 "data_size": 63488 00:11:52.424 } 00:11:52.424 ] 00:11:52.424 }' 00:11:52.424 23:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:52.424 23:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.993 23:53:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:52.993 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:53.258 [2024-05-14 23:53:53.666422] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:53.258 "name": "raid_bdev1", 00:11:53.258 "aliases": [ 00:11:53.258 "11ac5987-e345-4762-9f68-fdf46091ca4b" 00:11:53.258 ], 00:11:53.258 "product_name": "Raid Volume", 00:11:53.258 "block_size": 512, 00:11:53.258 "num_blocks": 63488, 00:11:53.258 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:53.258 "assigned_rate_limits": { 00:11:53.258 "rw_ios_per_sec": 0, 00:11:53.258 "rw_mbytes_per_sec": 0, 00:11:53.258 "r_mbytes_per_sec": 0, 00:11:53.258 "w_mbytes_per_sec": 0 00:11:53.258 }, 00:11:53.258 "claimed": false, 00:11:53.258 "zoned": false, 00:11:53.258 "supported_io_types": { 00:11:53.258 "read": true, 00:11:53.258 "write": true, 00:11:53.258 "unmap": false, 00:11:53.258 "write_zeroes": true, 00:11:53.258 "flush": false, 00:11:53.258 "reset": true, 00:11:53.258 "compare": false, 00:11:53.258 "compare_and_write": false, 00:11:53.258 "abort": false, 00:11:53.258 "nvme_admin": false, 00:11:53.258 "nvme_io": false 00:11:53.258 }, 00:11:53.258 "memory_domains": [ 00:11:53.258 { 00:11:53.258 "dma_device_id": "system", 00:11:53.258 "dma_device_type": 1 00:11:53.258 }, 00:11:53.258 { 00:11:53.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.258 "dma_device_type": 2 00:11:53.258 }, 00:11:53.258 { 00:11:53.258 "dma_device_id": "system", 00:11:53.258 "dma_device_type": 1 00:11:53.258 }, 00:11:53.258 { 00:11:53.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.258 "dma_device_type": 2 00:11:53.258 } 00:11:53.258 ], 00:11:53.258 "driver_specific": { 00:11:53.258 "raid": { 00:11:53.258 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:53.258 "strip_size_kb": 0, 00:11:53.258 "state": "online", 00:11:53.258 "raid_level": "raid1", 00:11:53.258 "superblock": true, 00:11:53.258 "num_base_bdevs": 2, 00:11:53.258 "num_base_bdevs_discovered": 2, 00:11:53.258 "num_base_bdevs_operational": 2, 00:11:53.258 "base_bdevs_list": [ 00:11:53.258 { 00:11:53.258 "name": "pt1", 00:11:53.258 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:53.258 "is_configured": true, 00:11:53.258 "data_offset": 2048, 00:11:53.258 "data_size": 63488 00:11:53.258 }, 00:11:53.258 { 00:11:53.258 "name": "pt2", 00:11:53.258 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:53.258 "is_configured": true, 00:11:53.258 "data_offset": 2048, 00:11:53.258 "data_size": 63488 00:11:53.258 } 00:11:53.258 ] 00:11:53.258 } 00:11:53.258 } 00:11:53.258 }' 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:53.258 pt2' 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:53.258 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:53.516 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:53.516 "name": "pt1", 00:11:53.516 "aliases": [ 00:11:53.516 "431fb222-b44c-598c-8966-5f9807a75ccc" 00:11:53.516 ], 00:11:53.516 "product_name": "passthru", 00:11:53.516 "block_size": 512, 00:11:53.516 "num_blocks": 65536, 00:11:53.516 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:53.516 "assigned_rate_limits": { 00:11:53.516 "rw_ios_per_sec": 0, 00:11:53.516 "rw_mbytes_per_sec": 0, 00:11:53.516 "r_mbytes_per_sec": 0, 00:11:53.516 "w_mbytes_per_sec": 0 00:11:53.516 }, 00:11:53.516 "claimed": true, 00:11:53.516 "claim_type": "exclusive_write", 00:11:53.516 "zoned": false, 00:11:53.516 "supported_io_types": { 00:11:53.516 "read": true, 00:11:53.516 "write": true, 00:11:53.516 "unmap": true, 00:11:53.516 "write_zeroes": true, 00:11:53.516 "flush": true, 00:11:53.516 "reset": true, 00:11:53.516 "compare": false, 00:11:53.516 "compare_and_write": false, 00:11:53.516 "abort": true, 00:11:53.516 "nvme_admin": false, 00:11:53.516 "nvme_io": false 00:11:53.516 }, 00:11:53.516 "memory_domains": [ 00:11:53.516 { 00:11:53.516 "dma_device_id": "system", 00:11:53.516 "dma_device_type": 1 00:11:53.516 }, 00:11:53.516 { 00:11:53.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.516 "dma_device_type": 2 00:11:53.516 } 00:11:53.516 ], 00:11:53.516 "driver_specific": { 00:11:53.516 "passthru": { 00:11:53.516 "name": "pt1", 00:11:53.516 "base_bdev_name": "malloc1" 00:11:53.516 } 00:11:53.516 } 00:11:53.516 }' 00:11:53.516 23:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:53.516 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:53.516 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:53.516 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:53.774 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:54.032 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:54.032 "name": "pt2", 00:11:54.032 "aliases": [ 00:11:54.032 "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7" 00:11:54.032 ], 00:11:54.032 "product_name": "passthru", 00:11:54.032 "block_size": 512, 00:11:54.032 "num_blocks": 65536, 00:11:54.032 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:54.032 "assigned_rate_limits": { 00:11:54.032 "rw_ios_per_sec": 0, 00:11:54.032 "rw_mbytes_per_sec": 0, 00:11:54.032 "r_mbytes_per_sec": 0, 00:11:54.032 "w_mbytes_per_sec": 0 00:11:54.032 }, 00:11:54.032 "claimed": true, 00:11:54.032 "claim_type": "exclusive_write", 00:11:54.032 "zoned": false, 00:11:54.032 "supported_io_types": { 00:11:54.032 "read": true, 00:11:54.032 "write": true, 00:11:54.032 "unmap": true, 00:11:54.032 "write_zeroes": true, 00:11:54.032 "flush": true, 00:11:54.032 "reset": true, 00:11:54.032 "compare": false, 00:11:54.032 "compare_and_write": false, 00:11:54.032 "abort": true, 00:11:54.032 "nvme_admin": false, 00:11:54.032 "nvme_io": false 00:11:54.032 }, 00:11:54.032 "memory_domains": [ 00:11:54.032 { 00:11:54.032 "dma_device_id": "system", 00:11:54.032 "dma_device_type": 1 00:11:54.032 }, 00:11:54.032 { 00:11:54.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.032 "dma_device_type": 2 00:11:54.032 } 00:11:54.032 ], 00:11:54.032 "driver_specific": { 00:11:54.032 "passthru": { 00:11:54.032 "name": "pt2", 00:11:54.032 "base_bdev_name": "malloc2" 00:11:54.032 } 00:11:54.032 } 00:11:54.032 }' 00:11:54.032 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:54.032 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:54.290 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:54.548 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:54.548 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:54.548 23:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:11:54.807 [2024-05-14 23:53:55.142310] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:54.807 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=11ac5987-e345-4762-9f68-fdf46091ca4b 00:11:54.807 23:53:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # '[' -z 11ac5987-e345-4762-9f68-fdf46091ca4b ']' 00:11:54.807 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:54.807 [2024-05-14 23:53:55.386750] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:54.807 [2024-05-14 23:53:55.386771] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:54.807 [2024-05-14 23:53:55.386828] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:54.807 [2024-05-14 23:53:55.386884] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:54.807 [2024-05-14 23:53:55.386896] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3c1f0 name raid_bdev1, state offline 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:55.065 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:55.324 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:55.324 23:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:55.581 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:55.581 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:55.840 23:53:56 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:55.840 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:56.098 [2024-05-14 23:53:56.605925] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:56.098 [2024-05-14 23:53:56.607277] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:56.098 [2024-05-14 23:53:56.607332] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:56.098 [2024-05-14 23:53:56.607374] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:56.098 [2024-05-14 23:53:56.607392] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:56.098 [2024-05-14 23:53:56.607411] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd8fc00 name raid_bdev1, state configuring 00:11:56.098 request: 00:11:56.098 { 00:11:56.098 "name": "raid_bdev1", 00:11:56.098 "raid_level": "raid1", 00:11:56.098 "base_bdevs": [ 00:11:56.098 "malloc1", 00:11:56.098 "malloc2" 00:11:56.098 ], 00:11:56.098 "superblock": false, 00:11:56.098 "method": "bdev_raid_create", 00:11:56.098 "req_id": 1 00:11:56.098 } 00:11:56.098 Got JSON-RPC error response 00:11:56.098 response: 00:11:56.098 { 00:11:56.098 "code": -17, 00:11:56.098 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:56.098 } 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.098 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:11:56.356 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:11:56.356 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:11:56.356 23:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:56.614 [2024-05-14 23:53:57.095161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match 
on malloc1 00:11:56.614 [2024-05-14 23:53:57.095206] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.614 [2024-05-14 23:53:57.095231] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd8f9b0 00:11:56.614 [2024-05-14 23:53:57.095244] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.614 [2024-05-14 23:53:57.096907] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.614 [2024-05-14 23:53:57.096935] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:56.614 [2024-05-14 23:53:57.097002] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:11:56.615 [2024-05-14 23:53:57.097029] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:56.615 pt1 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.615 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:56.873 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:56.873 "name": "raid_bdev1", 00:11:56.873 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:56.873 "strip_size_kb": 0, 00:11:56.873 "state": "configuring", 00:11:56.873 "raid_level": "raid1", 00:11:56.873 "superblock": true, 00:11:56.873 "num_base_bdevs": 2, 00:11:56.873 "num_base_bdevs_discovered": 1, 00:11:56.873 "num_base_bdevs_operational": 2, 00:11:56.873 "base_bdevs_list": [ 00:11:56.873 { 00:11:56.873 "name": "pt1", 00:11:56.873 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:56.873 "is_configured": true, 00:11:56.873 "data_offset": 2048, 00:11:56.873 "data_size": 63488 00:11:56.873 }, 00:11:56.873 { 00:11:56.873 "name": null, 00:11:56.873 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:56.873 "is_configured": false, 00:11:56.873 "data_offset": 2048, 00:11:56.873 "data_size": 63488 00:11:56.873 } 00:11:56.873 ] 00:11:56.873 }' 00:11:56.873 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:56.873 23:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.439 23:53:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:11:57.439 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:11:57.439 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:57.439 23:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:57.697 [2024-05-14 23:53:58.121890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:57.697 [2024-05-14 23:53:58.121938] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.697 [2024-05-14 23:53:58.121960] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf41110 00:11:57.697 [2024-05-14 23:53:58.121973] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.697 [2024-05-14 23:53:58.122310] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.697 [2024-05-14 23:53:58.122327] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:57.697 [2024-05-14 23:53:58.122390] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:11:57.697 [2024-05-14 23:53:58.122416] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:57.697 [2024-05-14 23:53:58.122515] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xf40a60 00:11:57.697 [2024-05-14 23:53:58.122530] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:57.697 [2024-05-14 23:53:58.122710] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd8f0e0 00:11:57.697 [2024-05-14 23:53:58.122840] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf40a60 00:11:57.697 [2024-05-14 23:53:58.122850] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf40a60 00:11:57.697 [2024-05-14 23:53:58.122948] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.697 pt2 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:57.697 
23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.697 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.956 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:57.956 "name": "raid_bdev1", 00:11:57.956 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:57.956 "strip_size_kb": 0, 00:11:57.956 "state": "online", 00:11:57.956 "raid_level": "raid1", 00:11:57.956 "superblock": true, 00:11:57.956 "num_base_bdevs": 2, 00:11:57.956 "num_base_bdevs_discovered": 2, 00:11:57.956 "num_base_bdevs_operational": 2, 00:11:57.956 "base_bdevs_list": [ 00:11:57.956 { 00:11:57.956 "name": "pt1", 00:11:57.956 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:57.956 "is_configured": true, 00:11:57.956 "data_offset": 2048, 00:11:57.956 "data_size": 63488 00:11:57.956 }, 00:11:57.956 { 00:11:57.956 "name": "pt2", 00:11:57.956 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:57.956 "is_configured": true, 00:11:57.956 "data_offset": 2048, 00:11:57.956 "data_size": 63488 00:11:57.956 } 00:11:57.956 ] 00:11:57.956 }' 00:11:57.956 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:57.956 23:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:58.522 23:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:58.780 [2024-05-14 23:53:59.221075] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:58.780 "name": "raid_bdev1", 00:11:58.780 "aliases": [ 00:11:58.780 "11ac5987-e345-4762-9f68-fdf46091ca4b" 00:11:58.780 ], 00:11:58.780 "product_name": "Raid Volume", 00:11:58.780 "block_size": 512, 00:11:58.780 "num_blocks": 63488, 00:11:58.780 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:58.780 "assigned_rate_limits": { 00:11:58.780 "rw_ios_per_sec": 0, 00:11:58.780 "rw_mbytes_per_sec": 0, 00:11:58.780 "r_mbytes_per_sec": 0, 00:11:58.780 "w_mbytes_per_sec": 0 00:11:58.780 }, 00:11:58.780 "claimed": false, 00:11:58.780 "zoned": false, 00:11:58.780 "supported_io_types": { 00:11:58.780 "read": true, 00:11:58.780 "write": true, 00:11:58.780 "unmap": false, 00:11:58.780 "write_zeroes": true, 00:11:58.780 "flush": false, 00:11:58.780 "reset": true, 00:11:58.780 "compare": false, 00:11:58.780 "compare_and_write": false, 00:11:58.780 "abort": false, 00:11:58.780 "nvme_admin": false, 00:11:58.780 
"nvme_io": false 00:11:58.780 }, 00:11:58.780 "memory_domains": [ 00:11:58.780 { 00:11:58.780 "dma_device_id": "system", 00:11:58.780 "dma_device_type": 1 00:11:58.780 }, 00:11:58.780 { 00:11:58.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.780 "dma_device_type": 2 00:11:58.780 }, 00:11:58.780 { 00:11:58.780 "dma_device_id": "system", 00:11:58.780 "dma_device_type": 1 00:11:58.780 }, 00:11:58.780 { 00:11:58.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.780 "dma_device_type": 2 00:11:58.780 } 00:11:58.780 ], 00:11:58.780 "driver_specific": { 00:11:58.780 "raid": { 00:11:58.780 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:11:58.780 "strip_size_kb": 0, 00:11:58.780 "state": "online", 00:11:58.780 "raid_level": "raid1", 00:11:58.780 "superblock": true, 00:11:58.780 "num_base_bdevs": 2, 00:11:58.780 "num_base_bdevs_discovered": 2, 00:11:58.780 "num_base_bdevs_operational": 2, 00:11:58.780 "base_bdevs_list": [ 00:11:58.780 { 00:11:58.780 "name": "pt1", 00:11:58.780 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:58.780 "is_configured": true, 00:11:58.780 "data_offset": 2048, 00:11:58.780 "data_size": 63488 00:11:58.780 }, 00:11:58.780 { 00:11:58.780 "name": "pt2", 00:11:58.780 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:58.780 "is_configured": true, 00:11:58.780 "data_offset": 2048, 00:11:58.780 "data_size": 63488 00:11:58.780 } 00:11:58.780 ] 00:11:58.780 } 00:11:58.780 } 00:11:58.780 }' 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:58.780 pt2' 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:58.780 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:59.039 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:59.039 "name": "pt1", 00:11:59.039 "aliases": [ 00:11:59.039 "431fb222-b44c-598c-8966-5f9807a75ccc" 00:11:59.039 ], 00:11:59.039 "product_name": "passthru", 00:11:59.039 "block_size": 512, 00:11:59.039 "num_blocks": 65536, 00:11:59.039 "uuid": "431fb222-b44c-598c-8966-5f9807a75ccc", 00:11:59.039 "assigned_rate_limits": { 00:11:59.039 "rw_ios_per_sec": 0, 00:11:59.039 "rw_mbytes_per_sec": 0, 00:11:59.039 "r_mbytes_per_sec": 0, 00:11:59.039 "w_mbytes_per_sec": 0 00:11:59.039 }, 00:11:59.039 "claimed": true, 00:11:59.039 "claim_type": "exclusive_write", 00:11:59.039 "zoned": false, 00:11:59.039 "supported_io_types": { 00:11:59.039 "read": true, 00:11:59.039 "write": true, 00:11:59.039 "unmap": true, 00:11:59.039 "write_zeroes": true, 00:11:59.039 "flush": true, 00:11:59.039 "reset": true, 00:11:59.039 "compare": false, 00:11:59.039 "compare_and_write": false, 00:11:59.039 "abort": true, 00:11:59.039 "nvme_admin": false, 00:11:59.039 "nvme_io": false 00:11:59.039 }, 00:11:59.039 "memory_domains": [ 00:11:59.039 { 00:11:59.039 "dma_device_id": "system", 00:11:59.039 "dma_device_type": 1 00:11:59.039 }, 00:11:59.039 { 00:11:59.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.039 "dma_device_type": 2 00:11:59.039 } 00:11:59.039 ], 00:11:59.039 "driver_specific": { 00:11:59.039 "passthru": { 
00:11:59.039 "name": "pt1", 00:11:59.039 "base_bdev_name": "malloc1" 00:11:59.039 } 00:11:59.039 } 00:11:59.039 }' 00:11:59.039 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.039 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.039 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:59.039 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:59.298 23:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:59.556 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:59.556 "name": "pt2", 00:11:59.556 "aliases": [ 00:11:59.556 "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7" 00:11:59.556 ], 00:11:59.556 "product_name": "passthru", 00:11:59.556 "block_size": 512, 00:11:59.556 "num_blocks": 65536, 00:11:59.556 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:11:59.556 "assigned_rate_limits": { 00:11:59.556 "rw_ios_per_sec": 0, 00:11:59.556 "rw_mbytes_per_sec": 0, 00:11:59.556 "r_mbytes_per_sec": 0, 00:11:59.556 "w_mbytes_per_sec": 0 00:11:59.556 }, 00:11:59.556 "claimed": true, 00:11:59.556 "claim_type": "exclusive_write", 00:11:59.556 "zoned": false, 00:11:59.556 "supported_io_types": { 00:11:59.556 "read": true, 00:11:59.556 "write": true, 00:11:59.556 "unmap": true, 00:11:59.556 "write_zeroes": true, 00:11:59.556 "flush": true, 00:11:59.556 "reset": true, 00:11:59.556 "compare": false, 00:11:59.556 "compare_and_write": false, 00:11:59.556 "abort": true, 00:11:59.556 "nvme_admin": false, 00:11:59.556 "nvme_io": false 00:11:59.556 }, 00:11:59.556 "memory_domains": [ 00:11:59.556 { 00:11:59.556 "dma_device_id": "system", 00:11:59.556 "dma_device_type": 1 00:11:59.556 }, 00:11:59.556 { 00:11:59.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.556 "dma_device_type": 2 00:11:59.556 } 00:11:59.556 ], 00:11:59.556 "driver_specific": { 00:11:59.556 "passthru": { 00:11:59.556 "name": "pt2", 00:11:59.556 "base_bdev_name": "malloc2" 00:11:59.556 } 00:11:59.556 } 00:11:59.556 }' 00:11:59.556 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:59.814 23:54:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.814 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.072 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.072 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:00.072 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:12:00.072 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:00.330 [2024-05-14 23:54:00.680949] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:00.330 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 11ac5987-e345-4762-9f68-fdf46091ca4b '!=' 11ac5987-e345-4762-9f68-fdf46091ca4b ']' 00:12:00.330 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:12:00.330 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:00.330 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:12:00.330 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:00.589 [2024-05-14 23:54:00.921384] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.589 23:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:00.847 23:54:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:00.847 "name": "raid_bdev1", 00:12:00.847 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:12:00.847 "strip_size_kb": 0, 00:12:00.847 "state": "online", 00:12:00.847 "raid_level": "raid1", 00:12:00.847 "superblock": true, 00:12:00.847 "num_base_bdevs": 2, 00:12:00.847 "num_base_bdevs_discovered": 1, 00:12:00.847 "num_base_bdevs_operational": 1, 00:12:00.847 "base_bdevs_list": [ 00:12:00.847 { 00:12:00.847 "name": null, 00:12:00.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.847 "is_configured": false, 00:12:00.847 "data_offset": 2048, 00:12:00.847 "data_size": 63488 00:12:00.847 }, 00:12:00.847 { 00:12:00.847 "name": "pt2", 00:12:00.847 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:12:00.847 "is_configured": true, 00:12:00.847 "data_offset": 2048, 00:12:00.847 "data_size": 63488 00:12:00.847 } 00:12:00.847 ] 00:12:00.847 }' 00:12:00.847 23:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:00.847 23:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.413 23:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:01.413 [2024-05-14 23:54:01.936050] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:01.413 [2024-05-14 23:54:01.936077] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:01.413 [2024-05-14 23:54:01.936132] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:01.413 [2024-05-14 23:54:01.936181] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:01.413 [2024-05-14 23:54:01.936193] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf40a60 name raid_bdev1, state offline 00:12:01.413 23:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.413 23:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:12:01.672 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:12:01.672 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:12:01.672 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:12:01.672 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:12:01.672 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=1 00:12:01.931 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:02.190 [2024-05-14 23:54:02.621839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:02.190 [2024-05-14 23:54:02.621885] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.190 [2024-05-14 23:54:02.621904] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf38aa0 00:12:02.190 [2024-05-14 23:54:02.621917] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.190 [2024-05-14 23:54:02.623516] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.190 [2024-05-14 23:54:02.623545] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:02.190 [2024-05-14 23:54:02.623608] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:12:02.190 [2024-05-14 23:54:02.623634] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:02.190 [2024-05-14 23:54:02.623718] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xf39bd0 00:12:02.190 [2024-05-14 23:54:02.623728] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:02.190 [2024-05-14 23:54:02.623909] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd8f450 00:12:02.190 [2024-05-14 23:54:02.624036] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf39bd0 00:12:02.190 [2024-05-14 23:54:02.624046] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf39bd0 00:12:02.190 [2024-05-14 23:54:02.624145] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.190 pt2 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.190 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:02.449 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:02.449 "name": "raid_bdev1", 00:12:02.449 "uuid": "11ac5987-e345-4762-9f68-fdf46091ca4b", 00:12:02.449 "strip_size_kb": 0, 00:12:02.449 "state": "online", 00:12:02.449 "raid_level": "raid1", 00:12:02.449 "superblock": true, 
00:12:02.449 "num_base_bdevs": 2, 00:12:02.449 "num_base_bdevs_discovered": 1, 00:12:02.449 "num_base_bdevs_operational": 1, 00:12:02.449 "base_bdevs_list": [ 00:12:02.449 { 00:12:02.449 "name": null, 00:12:02.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.449 "is_configured": false, 00:12:02.449 "data_offset": 2048, 00:12:02.449 "data_size": 63488 00:12:02.449 }, 00:12:02.449 { 00:12:02.449 "name": "pt2", 00:12:02.449 "uuid": "0c2d2685-4fd4-5f63-956f-a0c28d31fdc7", 00:12:02.449 "is_configured": true, 00:12:02.449 "data_offset": 2048, 00:12:02.449 "data_size": 63488 00:12:02.449 } 00:12:02.449 ] 00:12:02.449 }' 00:12:02.449 23:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:02.449 23:54:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.015 23:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:12:03.015 23:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:03.015 23:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:12:03.274 [2024-05-14 23:54:03.700905] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' 11ac5987-e345-4762-9f68-fdf46091ca4b '!=' 11ac5987-e345-4762-9f68-fdf46091ca4b ']' 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 398060 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 398060 ']' 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 398060 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 398060 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 398060' 00:12:03.274 killing process with pid 398060 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 398060 00:12:03.274 [2024-05-14 23:54:03.771051] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:03.274 [2024-05-14 23:54:03.771121] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:03.274 [2024-05-14 23:54:03.771178] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:03.274 [2024-05-14 23:54:03.771192] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf39bd0 name raid_bdev1, state offline 00:12:03.274 23:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 398060 00:12:03.274 [2024-05-14 23:54:03.789939] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:03.550 23:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:12:03.550 00:12:03.550 real 0m13.633s 00:12:03.550 
user 0m24.527s 00:12:03.550 sys 0m2.555s 00:12:03.550 23:54:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:03.550 23:54:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.550 ************************************ 00:12:03.550 END TEST raid_superblock_test 00:12:03.550 ************************************ 00:12:03.550 23:54:04 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:12:03.550 23:54:04 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:12:03.550 23:54:04 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:03.550 23:54:04 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:12:03.550 23:54:04 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:03.550 23:54:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:03.818 ************************************ 00:12:03.818 START TEST raid_state_function_test 00:12:03.818 ************************************ 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 false 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:12:03.818 23:54:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=400278 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 400278' 00:12:03.818 Process raid pid: 400278 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 400278 /var/tmp/spdk-raid.sock 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 400278 ']' 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:03.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:03.818 23:54:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.818 [2024-05-14 23:54:04.195923] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
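(For reference: the raid_state_function_test trace that follows drives the just-launched bdev_svc app purely through rpc.py on /var/tmp/spdk-raid.sock. Below is a minimal sketch of that call sequence, using only RPC commands that appear verbatim later in this log; the $rpc shorthand is introduced here for brevity, and the real test wraps each step in waitforbdev/verify helpers that are omitted.)

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# create a 32 MiB malloc bdev with 512-byte blocks to serve as a base device
$rpc bdev_malloc_create 32 512 -b BaseBdev1
# request a raid0 array with a 64 KiB strip size over three base bdevs;
# it stays in the "configuring" state until all three base bdevs exist
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
# dump every raid bdev so its state can be inspected
$rpc bdev_raid_get_bdevs all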
00:12:03.818 [2024-05-14 23:54:04.195984] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:03.818 [2024-05-14 23:54:04.322881] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:04.076 [2024-05-14 23:54:04.423507] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:04.076 [2024-05-14 23:54:04.491211] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.076 [2024-05-14 23:54:04.491249] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.643 23:54:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:04.643 23:54:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:12:04.643 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:04.901 [2024-05-14 23:54:05.361486] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.901 [2024-05-14 23:54:05.361535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.901 [2024-05-14 23:54:05.361550] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.901 [2024-05-14 23:54:05.361566] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.901 [2024-05-14 23:54:05.361578] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:04.901 [2024-05-14 23:54:05.361597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.901 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.159 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:12:05.159 "name": "Existed_Raid", 00:12:05.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.159 "strip_size_kb": 64, 00:12:05.159 "state": "configuring", 00:12:05.159 "raid_level": "raid0", 00:12:05.159 "superblock": false, 00:12:05.159 "num_base_bdevs": 3, 00:12:05.159 "num_base_bdevs_discovered": 0, 00:12:05.159 "num_base_bdevs_operational": 3, 00:12:05.159 "base_bdevs_list": [ 00:12:05.159 { 00:12:05.159 "name": "BaseBdev1", 00:12:05.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.159 "is_configured": false, 00:12:05.159 "data_offset": 0, 00:12:05.159 "data_size": 0 00:12:05.159 }, 00:12:05.159 { 00:12:05.159 "name": "BaseBdev2", 00:12:05.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.159 "is_configured": false, 00:12:05.159 "data_offset": 0, 00:12:05.159 "data_size": 0 00:12:05.159 }, 00:12:05.159 { 00:12:05.159 "name": "BaseBdev3", 00:12:05.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.159 "is_configured": false, 00:12:05.159 "data_offset": 0, 00:12:05.159 "data_size": 0 00:12:05.159 } 00:12:05.159 ] 00:12:05.159 }' 00:12:05.159 23:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:05.159 23:54:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.725 23:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:05.983 [2024-05-14 23:54:06.408123] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:05.983 [2024-05-14 23:54:06.408157] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2130be0 name Existed_Raid, state configuring 00:12:05.983 23:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:06.241 [2024-05-14 23:54:06.652786] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:06.241 [2024-05-14 23:54:06.652822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:06.241 [2024-05-14 23:54:06.652836] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:06.241 [2024-05-14 23:54:06.652863] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:06.241 [2024-05-14 23:54:06.652875] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:06.241 [2024-05-14 23:54:06.652892] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:06.241 23:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:06.499 [2024-05-14 23:54:06.904581] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:06.499 BaseBdev1 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:06.499 23:54:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:06.499 23:54:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:06.757 23:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:07.014 [ 00:12:07.014 { 00:12:07.014 "name": "BaseBdev1", 00:12:07.014 "aliases": [ 00:12:07.014 "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60" 00:12:07.014 ], 00:12:07.014 "product_name": "Malloc disk", 00:12:07.014 "block_size": 512, 00:12:07.014 "num_blocks": 65536, 00:12:07.014 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:07.014 "assigned_rate_limits": { 00:12:07.014 "rw_ios_per_sec": 0, 00:12:07.014 "rw_mbytes_per_sec": 0, 00:12:07.014 "r_mbytes_per_sec": 0, 00:12:07.014 "w_mbytes_per_sec": 0 00:12:07.014 }, 00:12:07.014 "claimed": true, 00:12:07.014 "claim_type": "exclusive_write", 00:12:07.014 "zoned": false, 00:12:07.014 "supported_io_types": { 00:12:07.014 "read": true, 00:12:07.014 "write": true, 00:12:07.014 "unmap": true, 00:12:07.014 "write_zeroes": true, 00:12:07.014 "flush": true, 00:12:07.014 "reset": true, 00:12:07.014 "compare": false, 00:12:07.014 "compare_and_write": false, 00:12:07.014 "abort": true, 00:12:07.014 "nvme_admin": false, 00:12:07.014 "nvme_io": false 00:12:07.014 }, 00:12:07.014 "memory_domains": [ 00:12:07.014 { 00:12:07.014 "dma_device_id": "system", 00:12:07.014 "dma_device_type": 1 00:12:07.014 }, 00:12:07.014 { 00:12:07.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.014 "dma_device_type": 2 00:12:07.014 } 00:12:07.015 ], 00:12:07.015 "driver_specific": {} 00:12:07.015 } 00:12:07.015 ] 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:07.015 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.272 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:07.272 "name": "Existed_Raid", 00:12:07.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.272 "strip_size_kb": 64, 00:12:07.272 "state": "configuring", 00:12:07.272 "raid_level": "raid0", 00:12:07.272 "superblock": false, 00:12:07.272 "num_base_bdevs": 3, 00:12:07.272 "num_base_bdevs_discovered": 1, 00:12:07.272 "num_base_bdevs_operational": 3, 00:12:07.272 "base_bdevs_list": [ 00:12:07.272 { 00:12:07.272 "name": "BaseBdev1", 00:12:07.272 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:07.272 "is_configured": true, 00:12:07.272 "data_offset": 0, 00:12:07.272 "data_size": 65536 00:12:07.272 }, 00:12:07.272 { 00:12:07.272 "name": "BaseBdev2", 00:12:07.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.272 "is_configured": false, 00:12:07.272 "data_offset": 0, 00:12:07.272 "data_size": 0 00:12:07.272 }, 00:12:07.272 { 00:12:07.272 "name": "BaseBdev3", 00:12:07.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.272 "is_configured": false, 00:12:07.272 "data_offset": 0, 00:12:07.272 "data_size": 0 00:12:07.272 } 00:12:07.272 ] 00:12:07.272 }' 00:12:07.272 23:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:07.272 23:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.838 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:08.096 [2024-05-14 23:54:08.432629] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:08.096 [2024-05-14 23:54:08.432673] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21304b0 name Existed_Raid, state configuring 00:12:08.096 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:08.096 [2024-05-14 23:54:08.673308] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:08.096 [2024-05-14 23:54:08.674799] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:08.096 [2024-05-14 23:54:08.674833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:08.096 [2024-05-14 23:54:08.674844] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:08.096 [2024-05-14 23:54:08.674856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:08.354 23:54:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.354 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.612 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:08.612 "name": "Existed_Raid", 00:12:08.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.612 "strip_size_kb": 64, 00:12:08.612 "state": "configuring", 00:12:08.612 "raid_level": "raid0", 00:12:08.612 "superblock": false, 00:12:08.612 "num_base_bdevs": 3, 00:12:08.612 "num_base_bdevs_discovered": 1, 00:12:08.612 "num_base_bdevs_operational": 3, 00:12:08.612 "base_bdevs_list": [ 00:12:08.612 { 00:12:08.612 "name": "BaseBdev1", 00:12:08.612 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:08.612 "is_configured": true, 00:12:08.612 "data_offset": 0, 00:12:08.612 "data_size": 65536 00:12:08.612 }, 00:12:08.612 { 00:12:08.612 "name": "BaseBdev2", 00:12:08.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.612 "is_configured": false, 00:12:08.612 "data_offset": 0, 00:12:08.612 "data_size": 0 00:12:08.612 }, 00:12:08.612 { 00:12:08.612 "name": "BaseBdev3", 00:12:08.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.612 "is_configured": false, 00:12:08.612 "data_offset": 0, 00:12:08.612 "data_size": 0 00:12:08.612 } 00:12:08.612 ] 00:12:08.612 }' 00:12:08.612 23:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:08.612 23:54:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.178 23:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:09.178 [2024-05-14 23:54:09.763520] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:09.178 BaseBdev2 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:09.436 23:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:09.436 23:54:09 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:09.694 23:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:09.694 [ 00:12:09.694 { 00:12:09.694 "name": "BaseBdev2", 00:12:09.694 "aliases": [ 00:12:09.694 "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5" 00:12:09.694 ], 00:12:09.694 "product_name": "Malloc disk", 00:12:09.694 "block_size": 512, 00:12:09.694 "num_blocks": 65536, 00:12:09.694 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:09.694 "assigned_rate_limits": { 00:12:09.694 "rw_ios_per_sec": 0, 00:12:09.694 "rw_mbytes_per_sec": 0, 00:12:09.694 "r_mbytes_per_sec": 0, 00:12:09.694 "w_mbytes_per_sec": 0 00:12:09.694 }, 00:12:09.694 "claimed": true, 00:12:09.694 "claim_type": "exclusive_write", 00:12:09.694 "zoned": false, 00:12:09.694 "supported_io_types": { 00:12:09.694 "read": true, 00:12:09.694 "write": true, 00:12:09.694 "unmap": true, 00:12:09.694 "write_zeroes": true, 00:12:09.694 "flush": true, 00:12:09.694 "reset": true, 00:12:09.694 "compare": false, 00:12:09.695 "compare_and_write": false, 00:12:09.695 "abort": true, 00:12:09.695 "nvme_admin": false, 00:12:09.695 "nvme_io": false 00:12:09.695 }, 00:12:09.695 "memory_domains": [ 00:12:09.695 { 00:12:09.695 "dma_device_id": "system", 00:12:09.695 "dma_device_type": 1 00:12:09.695 }, 00:12:09.695 { 00:12:09.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.695 "dma_device_type": 2 00:12:09.695 } 00:12:09.695 ], 00:12:09.695 "driver_specific": {} 00:12:09.695 } 00:12:09.695 ] 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.695 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:12:09.952 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:09.952 "name": "Existed_Raid", 00:12:09.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.952 "strip_size_kb": 64, 00:12:09.952 "state": "configuring", 00:12:09.952 "raid_level": "raid0", 00:12:09.952 "superblock": false, 00:12:09.952 "num_base_bdevs": 3, 00:12:09.952 "num_base_bdevs_discovered": 2, 00:12:09.952 "num_base_bdevs_operational": 3, 00:12:09.952 "base_bdevs_list": [ 00:12:09.952 { 00:12:09.952 "name": "BaseBdev1", 00:12:09.952 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:09.952 "is_configured": true, 00:12:09.952 "data_offset": 0, 00:12:09.952 "data_size": 65536 00:12:09.952 }, 00:12:09.952 { 00:12:09.952 "name": "BaseBdev2", 00:12:09.952 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:09.952 "is_configured": true, 00:12:09.952 "data_offset": 0, 00:12:09.952 "data_size": 65536 00:12:09.952 }, 00:12:09.952 { 00:12:09.952 "name": "BaseBdev3", 00:12:09.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.952 "is_configured": false, 00:12:09.952 "data_offset": 0, 00:12:09.952 "data_size": 0 00:12:09.952 } 00:12:09.952 ] 00:12:09.952 }' 00:12:09.953 23:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:09.953 23:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.518 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:10.776 [2024-05-14 23:54:11.218771] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:10.776 [2024-05-14 23:54:11.218808] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2131560 00:12:10.776 [2024-05-14 23:54:11.218817] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:10.776 [2024-05-14 23:54:11.219012] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2148490 00:12:10.776 [2024-05-14 23:54:11.219135] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2131560 00:12:10.776 [2024-05-14 23:54:11.219145] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2131560 00:12:10.776 [2024-05-14 23:54:11.219308] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:10.776 BaseBdev3 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:10.776 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:11.035 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:11.293 [ 00:12:11.293 { 00:12:11.293 "name": "BaseBdev3", 00:12:11.293 "aliases": [ 00:12:11.293 "8b16b037-edbb-4a8a-89bd-135beb535dd8" 00:12:11.293 ], 00:12:11.293 "product_name": "Malloc disk", 00:12:11.293 "block_size": 512, 00:12:11.293 "num_blocks": 65536, 00:12:11.293 "uuid": "8b16b037-edbb-4a8a-89bd-135beb535dd8", 00:12:11.294 "assigned_rate_limits": { 00:12:11.294 "rw_ios_per_sec": 0, 00:12:11.294 "rw_mbytes_per_sec": 0, 00:12:11.294 "r_mbytes_per_sec": 0, 00:12:11.294 "w_mbytes_per_sec": 0 00:12:11.294 }, 00:12:11.294 "claimed": true, 00:12:11.294 "claim_type": "exclusive_write", 00:12:11.294 "zoned": false, 00:12:11.294 "supported_io_types": { 00:12:11.294 "read": true, 00:12:11.294 "write": true, 00:12:11.294 "unmap": true, 00:12:11.294 "write_zeroes": true, 00:12:11.294 "flush": true, 00:12:11.294 "reset": true, 00:12:11.294 "compare": false, 00:12:11.294 "compare_and_write": false, 00:12:11.294 "abort": true, 00:12:11.294 "nvme_admin": false, 00:12:11.294 "nvme_io": false 00:12:11.294 }, 00:12:11.294 "memory_domains": [ 00:12:11.294 { 00:12:11.294 "dma_device_id": "system", 00:12:11.294 "dma_device_type": 1 00:12:11.294 }, 00:12:11.294 { 00:12:11.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.294 "dma_device_type": 2 00:12:11.294 } 00:12:11.294 ], 00:12:11.294 "driver_specific": {} 00:12:11.294 } 00:12:11.294 ] 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.294 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:11.553 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:11.553 "name": "Existed_Raid", 00:12:11.553 "uuid": "f6fe2151-3788-45d3-9f2b-66eca898cfe9", 00:12:11.553 "strip_size_kb": 64, 00:12:11.553 "state": "online", 
00:12:11.553 "raid_level": "raid0", 00:12:11.553 "superblock": false, 00:12:11.553 "num_base_bdevs": 3, 00:12:11.553 "num_base_bdevs_discovered": 3, 00:12:11.553 "num_base_bdevs_operational": 3, 00:12:11.553 "base_bdevs_list": [ 00:12:11.553 { 00:12:11.553 "name": "BaseBdev1", 00:12:11.553 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:11.553 "is_configured": true, 00:12:11.553 "data_offset": 0, 00:12:11.553 "data_size": 65536 00:12:11.553 }, 00:12:11.553 { 00:12:11.553 "name": "BaseBdev2", 00:12:11.553 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:11.553 "is_configured": true, 00:12:11.553 "data_offset": 0, 00:12:11.553 "data_size": 65536 00:12:11.553 }, 00:12:11.553 { 00:12:11.553 "name": "BaseBdev3", 00:12:11.553 "uuid": "8b16b037-edbb-4a8a-89bd-135beb535dd8", 00:12:11.553 "is_configured": true, 00:12:11.553 "data_offset": 0, 00:12:11.553 "data_size": 65536 00:12:11.553 } 00:12:11.553 ] 00:12:11.553 }' 00:12:11.553 23:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:11.553 23:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:12.120 [2024-05-14 23:54:12.562611] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:12.120 "name": "Existed_Raid", 00:12:12.120 "aliases": [ 00:12:12.120 "f6fe2151-3788-45d3-9f2b-66eca898cfe9" 00:12:12.120 ], 00:12:12.120 "product_name": "Raid Volume", 00:12:12.120 "block_size": 512, 00:12:12.120 "num_blocks": 196608, 00:12:12.120 "uuid": "f6fe2151-3788-45d3-9f2b-66eca898cfe9", 00:12:12.120 "assigned_rate_limits": { 00:12:12.120 "rw_ios_per_sec": 0, 00:12:12.120 "rw_mbytes_per_sec": 0, 00:12:12.120 "r_mbytes_per_sec": 0, 00:12:12.120 "w_mbytes_per_sec": 0 00:12:12.120 }, 00:12:12.120 "claimed": false, 00:12:12.120 "zoned": false, 00:12:12.120 "supported_io_types": { 00:12:12.120 "read": true, 00:12:12.120 "write": true, 00:12:12.120 "unmap": true, 00:12:12.120 "write_zeroes": true, 00:12:12.120 "flush": true, 00:12:12.120 "reset": true, 00:12:12.120 "compare": false, 00:12:12.120 "compare_and_write": false, 00:12:12.120 "abort": false, 00:12:12.120 "nvme_admin": false, 00:12:12.120 "nvme_io": false 00:12:12.120 }, 00:12:12.120 "memory_domains": [ 00:12:12.120 { 00:12:12.120 "dma_device_id": "system", 00:12:12.120 "dma_device_type": 1 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.120 "dma_device_type": 2 00:12:12.120 }, 
00:12:12.120 { 00:12:12.120 "dma_device_id": "system", 00:12:12.120 "dma_device_type": 1 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.120 "dma_device_type": 2 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "dma_device_id": "system", 00:12:12.120 "dma_device_type": 1 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.120 "dma_device_type": 2 00:12:12.120 } 00:12:12.120 ], 00:12:12.120 "driver_specific": { 00:12:12.120 "raid": { 00:12:12.120 "uuid": "f6fe2151-3788-45d3-9f2b-66eca898cfe9", 00:12:12.120 "strip_size_kb": 64, 00:12:12.120 "state": "online", 00:12:12.120 "raid_level": "raid0", 00:12:12.120 "superblock": false, 00:12:12.120 "num_base_bdevs": 3, 00:12:12.120 "num_base_bdevs_discovered": 3, 00:12:12.120 "num_base_bdevs_operational": 3, 00:12:12.120 "base_bdevs_list": [ 00:12:12.120 { 00:12:12.120 "name": "BaseBdev1", 00:12:12.120 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:12.120 "is_configured": true, 00:12:12.120 "data_offset": 0, 00:12:12.120 "data_size": 65536 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "name": "BaseBdev2", 00:12:12.120 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:12.120 "is_configured": true, 00:12:12.120 "data_offset": 0, 00:12:12.120 "data_size": 65536 00:12:12.120 }, 00:12:12.120 { 00:12:12.120 "name": "BaseBdev3", 00:12:12.120 "uuid": "8b16b037-edbb-4a8a-89bd-135beb535dd8", 00:12:12.120 "is_configured": true, 00:12:12.120 "data_offset": 0, 00:12:12.120 "data_size": 65536 00:12:12.120 } 00:12:12.120 ] 00:12:12.120 } 00:12:12.120 } 00:12:12.120 }' 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:12:12.120 BaseBdev2 00:12:12.120 BaseBdev3' 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:12.120 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:12.380 "name": "BaseBdev1", 00:12:12.380 "aliases": [ 00:12:12.380 "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60" 00:12:12.380 ], 00:12:12.380 "product_name": "Malloc disk", 00:12:12.380 "block_size": 512, 00:12:12.380 "num_blocks": 65536, 00:12:12.380 "uuid": "52d16ac0-ff3d-4ceb-9c29-b3a6d0389f60", 00:12:12.380 "assigned_rate_limits": { 00:12:12.380 "rw_ios_per_sec": 0, 00:12:12.380 "rw_mbytes_per_sec": 0, 00:12:12.380 "r_mbytes_per_sec": 0, 00:12:12.380 "w_mbytes_per_sec": 0 00:12:12.380 }, 00:12:12.380 "claimed": true, 00:12:12.380 "claim_type": "exclusive_write", 00:12:12.380 "zoned": false, 00:12:12.380 "supported_io_types": { 00:12:12.380 "read": true, 00:12:12.380 "write": true, 00:12:12.380 "unmap": true, 00:12:12.380 "write_zeroes": true, 00:12:12.380 "flush": true, 00:12:12.380 "reset": true, 00:12:12.380 "compare": false, 00:12:12.380 "compare_and_write": false, 00:12:12.380 "abort": true, 00:12:12.380 "nvme_admin": false, 00:12:12.380 "nvme_io": false 00:12:12.380 }, 00:12:12.380 "memory_domains": [ 00:12:12.380 { 00:12:12.380 "dma_device_id": "system", 
00:12:12.380 "dma_device_type": 1 00:12:12.380 }, 00:12:12.380 { 00:12:12.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.380 "dma_device_type": 2 00:12:12.380 } 00:12:12.380 ], 00:12:12.380 "driver_specific": {} 00:12:12.380 }' 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:12.380 23:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:12.639 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:12.898 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:12.898 "name": "BaseBdev2", 00:12:12.898 "aliases": [ 00:12:12.898 "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5" 00:12:12.898 ], 00:12:12.898 "product_name": "Malloc disk", 00:12:12.898 "block_size": 512, 00:12:12.898 "num_blocks": 65536, 00:12:12.898 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:12.898 "assigned_rate_limits": { 00:12:12.898 "rw_ios_per_sec": 0, 00:12:12.898 "rw_mbytes_per_sec": 0, 00:12:12.898 "r_mbytes_per_sec": 0, 00:12:12.898 "w_mbytes_per_sec": 0 00:12:12.898 }, 00:12:12.898 "claimed": true, 00:12:12.898 "claim_type": "exclusive_write", 00:12:12.898 "zoned": false, 00:12:12.898 "supported_io_types": { 00:12:12.898 "read": true, 00:12:12.898 "write": true, 00:12:12.898 "unmap": true, 00:12:12.898 "write_zeroes": true, 00:12:12.898 "flush": true, 00:12:12.898 "reset": true, 00:12:12.898 "compare": false, 00:12:12.898 "compare_and_write": false, 00:12:12.898 "abort": true, 00:12:12.898 "nvme_admin": false, 00:12:12.898 "nvme_io": false 00:12:12.898 }, 00:12:12.898 "memory_domains": [ 00:12:12.898 { 00:12:12.898 "dma_device_id": "system", 00:12:12.898 "dma_device_type": 1 00:12:12.898 }, 00:12:12.898 { 00:12:12.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.898 "dma_device_type": 2 00:12:12.898 } 00:12:12.898 ], 00:12:12.898 "driver_specific": {} 00:12:12.898 }' 00:12:12.898 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.898 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.898 23:54:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:12.898 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:13.156 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:13.415 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:13.415 "name": "BaseBdev3", 00:12:13.415 "aliases": [ 00:12:13.415 "8b16b037-edbb-4a8a-89bd-135beb535dd8" 00:12:13.415 ], 00:12:13.415 "product_name": "Malloc disk", 00:12:13.415 "block_size": 512, 00:12:13.415 "num_blocks": 65536, 00:12:13.415 "uuid": "8b16b037-edbb-4a8a-89bd-135beb535dd8", 00:12:13.415 "assigned_rate_limits": { 00:12:13.415 "rw_ios_per_sec": 0, 00:12:13.415 "rw_mbytes_per_sec": 0, 00:12:13.415 "r_mbytes_per_sec": 0, 00:12:13.415 "w_mbytes_per_sec": 0 00:12:13.415 }, 00:12:13.415 "claimed": true, 00:12:13.415 "claim_type": "exclusive_write", 00:12:13.415 "zoned": false, 00:12:13.415 "supported_io_types": { 00:12:13.415 "read": true, 00:12:13.415 "write": true, 00:12:13.415 "unmap": true, 00:12:13.415 "write_zeroes": true, 00:12:13.416 "flush": true, 00:12:13.416 "reset": true, 00:12:13.416 "compare": false, 00:12:13.416 "compare_and_write": false, 00:12:13.416 "abort": true, 00:12:13.416 "nvme_admin": false, 00:12:13.416 "nvme_io": false 00:12:13.416 }, 00:12:13.416 "memory_domains": [ 00:12:13.416 { 00:12:13.416 "dma_device_id": "system", 00:12:13.416 "dma_device_type": 1 00:12:13.416 }, 00:12:13.416 { 00:12:13.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:13.416 "dma_device_type": 2 00:12:13.416 } 00:12:13.416 ], 00:12:13.416 "driver_specific": {} 00:12:13.416 }' 00:12:13.416 23:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:13.416 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:13.675 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:13.933 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:13.933 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:13.933 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:13.933 [2024-05-14 23:54:14.523711] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:13.933 [2024-05-14 23:54:14.523739] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:13.933 [2024-05-14 23:54:14.523782] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.192 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.451 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:14.451 "name": "Existed_Raid", 00:12:14.451 "uuid": "f6fe2151-3788-45d3-9f2b-66eca898cfe9", 00:12:14.451 "strip_size_kb": 64, 00:12:14.451 "state": "offline", 00:12:14.451 "raid_level": "raid0", 00:12:14.451 "superblock": false, 00:12:14.451 "num_base_bdevs": 3, 00:12:14.451 "num_base_bdevs_discovered": 2, 00:12:14.451 
"num_base_bdevs_operational": 2, 00:12:14.451 "base_bdevs_list": [ 00:12:14.451 { 00:12:14.451 "name": null, 00:12:14.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.451 "is_configured": false, 00:12:14.451 "data_offset": 0, 00:12:14.451 "data_size": 65536 00:12:14.451 }, 00:12:14.451 { 00:12:14.451 "name": "BaseBdev2", 00:12:14.451 "uuid": "e3fd6929-0dcd-45c9-acd5-17ac186dd0c5", 00:12:14.451 "is_configured": true, 00:12:14.451 "data_offset": 0, 00:12:14.451 "data_size": 65536 00:12:14.451 }, 00:12:14.451 { 00:12:14.451 "name": "BaseBdev3", 00:12:14.451 "uuid": "8b16b037-edbb-4a8a-89bd-135beb535dd8", 00:12:14.451 "is_configured": true, 00:12:14.451 "data_offset": 0, 00:12:14.451 "data_size": 65536 00:12:14.451 } 00:12:14.451 ] 00:12:14.451 }' 00:12:14.451 23:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:14.451 23:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:15.020 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:15.279 [2024-05-14 23:54:15.800198] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:15.279 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:15.279 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:15.279 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.279 23:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:15.538 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:15.538 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:15.538 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:15.798 [2024-05-14 23:54:16.285917] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:15.798 [2024-05-14 23:54:16.285959] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2131560 name Existed_Raid, state offline 00:12:15.798 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:15.798 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:15.798 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.798 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:16.057 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:16.316 BaseBdev2 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:16.316 23:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.605 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:16.863 [ 00:12:16.863 { 00:12:16.863 "name": "BaseBdev2", 00:12:16.863 "aliases": [ 00:12:16.863 "f841054e-5657-49d9-bd0b-a972b54fd0ec" 00:12:16.863 ], 00:12:16.863 "product_name": "Malloc disk", 00:12:16.863 "block_size": 512, 00:12:16.863 "num_blocks": 65536, 00:12:16.863 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:16.863 "assigned_rate_limits": { 00:12:16.863 "rw_ios_per_sec": 0, 00:12:16.863 "rw_mbytes_per_sec": 0, 00:12:16.863 "r_mbytes_per_sec": 0, 00:12:16.863 "w_mbytes_per_sec": 0 00:12:16.863 }, 00:12:16.863 "claimed": false, 00:12:16.863 "zoned": false, 00:12:16.863 "supported_io_types": { 00:12:16.863 "read": true, 00:12:16.863 "write": true, 00:12:16.863 "unmap": true, 00:12:16.863 "write_zeroes": true, 00:12:16.863 "flush": true, 00:12:16.863 "reset": true, 00:12:16.863 "compare": false, 00:12:16.863 "compare_and_write": false, 00:12:16.863 "abort": true, 00:12:16.863 "nvme_admin": false, 00:12:16.863 "nvme_io": false 00:12:16.863 }, 00:12:16.863 "memory_domains": [ 00:12:16.863 { 00:12:16.863 "dma_device_id": "system", 00:12:16.863 "dma_device_type": 1 00:12:16.863 }, 00:12:16.863 { 00:12:16.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.863 "dma_device_type": 2 00:12:16.863 } 00:12:16.863 ], 00:12:16.863 "driver_specific": {} 00:12:16.863 } 00:12:16.863 ] 00:12:16.863 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:16.863 23:54:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:16.863 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:16.863 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:17.122 BaseBdev3 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:17.122 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:17.381 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:17.381 [ 00:12:17.381 { 00:12:17.381 "name": "BaseBdev3", 00:12:17.381 "aliases": [ 00:12:17.381 "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf" 00:12:17.381 ], 00:12:17.381 "product_name": "Malloc disk", 00:12:17.381 "block_size": 512, 00:12:17.381 "num_blocks": 65536, 00:12:17.381 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:17.381 "assigned_rate_limits": { 00:12:17.381 "rw_ios_per_sec": 0, 00:12:17.381 "rw_mbytes_per_sec": 0, 00:12:17.381 "r_mbytes_per_sec": 0, 00:12:17.381 "w_mbytes_per_sec": 0 00:12:17.381 }, 00:12:17.381 "claimed": false, 00:12:17.381 "zoned": false, 00:12:17.381 "supported_io_types": { 00:12:17.381 "read": true, 00:12:17.381 "write": true, 00:12:17.381 "unmap": true, 00:12:17.381 "write_zeroes": true, 00:12:17.381 "flush": true, 00:12:17.381 "reset": true, 00:12:17.381 "compare": false, 00:12:17.381 "compare_and_write": false, 00:12:17.381 "abort": true, 00:12:17.381 "nvme_admin": false, 00:12:17.381 "nvme_io": false 00:12:17.381 }, 00:12:17.381 "memory_domains": [ 00:12:17.381 { 00:12:17.381 "dma_device_id": "system", 00:12:17.381 "dma_device_type": 1 00:12:17.381 }, 00:12:17.381 { 00:12:17.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.381 "dma_device_type": 2 00:12:17.381 } 00:12:17.381 ], 00:12:17.381 "driver_specific": {} 00:12:17.381 } 00:12:17.381 ] 00:12:17.640 23:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:17.640 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:17.640 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:17.640 23:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:17.640 [2024-05-14 23:54:18.202521] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:17.640 
[2024-05-14 23:54:18.202562] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:17.640 [2024-05-14 23:54:18.202583] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:17.640 [2024-05-14 23:54:18.203954] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.640 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.898 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:17.898 "name": "Existed_Raid", 00:12:17.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.898 "strip_size_kb": 64, 00:12:17.898 "state": "configuring", 00:12:17.898 "raid_level": "raid0", 00:12:17.898 "superblock": false, 00:12:17.898 "num_base_bdevs": 3, 00:12:17.898 "num_base_bdevs_discovered": 2, 00:12:17.898 "num_base_bdevs_operational": 3, 00:12:17.898 "base_bdevs_list": [ 00:12:17.898 { 00:12:17.898 "name": "BaseBdev1", 00:12:17.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.898 "is_configured": false, 00:12:17.898 "data_offset": 0, 00:12:17.898 "data_size": 0 00:12:17.898 }, 00:12:17.898 { 00:12:17.898 "name": "BaseBdev2", 00:12:17.898 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:17.898 "is_configured": true, 00:12:17.898 "data_offset": 0, 00:12:17.898 "data_size": 65536 00:12:17.898 }, 00:12:17.898 { 00:12:17.898 "name": "BaseBdev3", 00:12:17.898 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:17.898 "is_configured": true, 00:12:17.898 "data_offset": 0, 00:12:17.898 "data_size": 65536 00:12:17.898 } 00:12:17.898 ] 00:12:17.898 }' 00:12:17.898 23:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:17.898 23:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:18.833 [2024-05-14 23:54:19.297612] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.833 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.091 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:19.091 "name": "Existed_Raid", 00:12:19.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.091 "strip_size_kb": 64, 00:12:19.091 "state": "configuring", 00:12:19.091 "raid_level": "raid0", 00:12:19.091 "superblock": false, 00:12:19.091 "num_base_bdevs": 3, 00:12:19.091 "num_base_bdevs_discovered": 1, 00:12:19.091 "num_base_bdevs_operational": 3, 00:12:19.091 "base_bdevs_list": [ 00:12:19.091 { 00:12:19.091 "name": "BaseBdev1", 00:12:19.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:19.091 "is_configured": false, 00:12:19.091 "data_offset": 0, 00:12:19.091 "data_size": 0 00:12:19.091 }, 00:12:19.091 { 00:12:19.091 "name": null, 00:12:19.091 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:19.091 "is_configured": false, 00:12:19.091 "data_offset": 0, 00:12:19.091 "data_size": 65536 00:12:19.091 }, 00:12:19.091 { 00:12:19.091 "name": "BaseBdev3", 00:12:19.091 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:19.091 "is_configured": true, 00:12:19.092 "data_offset": 0, 00:12:19.092 "data_size": 65536 00:12:19.092 } 00:12:19.092 ] 00:12:19.092 }' 00:12:19.092 23:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:19.092 23:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.658 23:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.658 23:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:19.916 23:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:12:19.916 23:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:20.174 [2024-05-14 23:54:20.657943] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:20.174 BaseBdev1 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:20.174 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:20.433 23:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:20.691 [ 00:12:20.691 { 00:12:20.691 "name": "BaseBdev1", 00:12:20.691 "aliases": [ 00:12:20.691 "2e16453b-5a6a-47b9-bf02-239f9e7dab9e" 00:12:20.691 ], 00:12:20.691 "product_name": "Malloc disk", 00:12:20.691 "block_size": 512, 00:12:20.691 "num_blocks": 65536, 00:12:20.691 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:20.691 "assigned_rate_limits": { 00:12:20.691 "rw_ios_per_sec": 0, 00:12:20.691 "rw_mbytes_per_sec": 0, 00:12:20.691 "r_mbytes_per_sec": 0, 00:12:20.691 "w_mbytes_per_sec": 0 00:12:20.691 }, 00:12:20.691 "claimed": true, 00:12:20.691 "claim_type": "exclusive_write", 00:12:20.691 "zoned": false, 00:12:20.691 "supported_io_types": { 00:12:20.691 "read": true, 00:12:20.691 "write": true, 00:12:20.691 "unmap": true, 00:12:20.691 "write_zeroes": true, 00:12:20.691 "flush": true, 00:12:20.691 "reset": true, 00:12:20.691 "compare": false, 00:12:20.691 "compare_and_write": false, 00:12:20.691 "abort": true, 00:12:20.691 "nvme_admin": false, 00:12:20.691 "nvme_io": false 00:12:20.691 }, 00:12:20.691 "memory_domains": [ 00:12:20.691 { 00:12:20.691 "dma_device_id": "system", 00:12:20.691 "dma_device_type": 1 00:12:20.691 }, 00:12:20.691 { 00:12:20.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.691 "dma_device_type": 2 00:12:20.691 } 00:12:20.691 ], 00:12:20.691 "driver_specific": {} 00:12:20.691 } 00:12:20.691 ] 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:20.691 23:54:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.691 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.950 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:20.950 "name": "Existed_Raid", 00:12:20.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.950 "strip_size_kb": 64, 00:12:20.950 "state": "configuring", 00:12:20.950 "raid_level": "raid0", 00:12:20.950 "superblock": false, 00:12:20.950 "num_base_bdevs": 3, 00:12:20.950 "num_base_bdevs_discovered": 2, 00:12:20.950 "num_base_bdevs_operational": 3, 00:12:20.950 "base_bdevs_list": [ 00:12:20.950 { 00:12:20.950 "name": "BaseBdev1", 00:12:20.950 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:20.950 "is_configured": true, 00:12:20.950 "data_offset": 0, 00:12:20.950 "data_size": 65536 00:12:20.950 }, 00:12:20.950 { 00:12:20.950 "name": null, 00:12:20.950 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:20.950 "is_configured": false, 00:12:20.950 "data_offset": 0, 00:12:20.950 "data_size": 65536 00:12:20.950 }, 00:12:20.950 { 00:12:20.950 "name": "BaseBdev3", 00:12:20.950 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:20.950 "is_configured": true, 00:12:20.950 "data_offset": 0, 00:12:20.950 "data_size": 65536 00:12:20.950 } 00:12:20.950 ] 00:12:20.950 }' 00:12:20.950 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:20.950 23:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.516 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.516 23:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:21.774 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:12:21.774 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:22.033 [2024-05-14 23:54:22.438685] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:22.033 23:54:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.033 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.291 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:22.291 "name": "Existed_Raid", 00:12:22.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.292 "strip_size_kb": 64, 00:12:22.292 "state": "configuring", 00:12:22.292 "raid_level": "raid0", 00:12:22.292 "superblock": false, 00:12:22.292 "num_base_bdevs": 3, 00:12:22.292 "num_base_bdevs_discovered": 1, 00:12:22.292 "num_base_bdevs_operational": 3, 00:12:22.292 "base_bdevs_list": [ 00:12:22.292 { 00:12:22.292 "name": "BaseBdev1", 00:12:22.292 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:22.292 "is_configured": true, 00:12:22.292 "data_offset": 0, 00:12:22.292 "data_size": 65536 00:12:22.292 }, 00:12:22.292 { 00:12:22.292 "name": null, 00:12:22.292 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:22.292 "is_configured": false, 00:12:22.292 "data_offset": 0, 00:12:22.292 "data_size": 65536 00:12:22.292 }, 00:12:22.292 { 00:12:22.292 "name": null, 00:12:22.292 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:22.292 "is_configured": false, 00:12:22.292 "data_offset": 0, 00:12:22.292 "data_size": 65536 00:12:22.292 } 00:12:22.292 ] 00:12:22.292 }' 00:12:22.292 23:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:22.292 23:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.858 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.858 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:23.116 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:12:23.116 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:23.373 [2024-05-14 23:54:23.746149] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 
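The trace above detaches BaseBdev3 with bdev_raid_remove_base_bdev, confirms its slot now reads is_configured == false, then hands the bdev back with bdev_raid_add_base_bdev and re-verifies the array. A minimal standalone sketch of that remove/re-add cycle, assuming an SPDK target is already serving RPCs on /var/tmp/spdk-raid.sock with the same bdev names as this run (the rpc variable name is ours):

#!/usr/bin/env bash
# Sketch only: socket path and bdev names are the ones used in this run.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Detach the third member; the raid bdev should remain in "configuring" state.
$rpc bdev_raid_remove_base_bdev BaseBdev3

# The vacated slot is reported with is_configured == false.
slot=$($rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[2].is_configured')
[[ "$slot" == "false" ]] || echo "unexpected slot state: $slot"

# Hand the same malloc bdev back; the raid module claims it again.
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'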
00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.374 23:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.631 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:23.631 "name": "Existed_Raid", 00:12:23.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.631 "strip_size_kb": 64, 00:12:23.631 "state": "configuring", 00:12:23.631 "raid_level": "raid0", 00:12:23.631 "superblock": false, 00:12:23.631 "num_base_bdevs": 3, 00:12:23.631 "num_base_bdevs_discovered": 2, 00:12:23.631 "num_base_bdevs_operational": 3, 00:12:23.631 "base_bdevs_list": [ 00:12:23.631 { 00:12:23.631 "name": "BaseBdev1", 00:12:23.631 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:23.631 "is_configured": true, 00:12:23.631 "data_offset": 0, 00:12:23.631 "data_size": 65536 00:12:23.631 }, 00:12:23.631 { 00:12:23.631 "name": null, 00:12:23.631 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:23.631 "is_configured": false, 00:12:23.631 "data_offset": 0, 00:12:23.631 "data_size": 65536 00:12:23.631 }, 00:12:23.631 { 00:12:23.631 "name": "BaseBdev3", 00:12:23.631 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:23.631 "is_configured": true, 00:12:23.631 "data_offset": 0, 00:12:23.632 "data_size": 65536 00:12:23.632 } 00:12:23.632 ] 00:12:23.632 }' 00:12:23.632 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:23.632 23:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.198 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.198 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:24.456 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:12:24.456 23:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:24.715 [2024-05-14 23:54:25.053632] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:24.715 
23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.715 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.974 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:24.974 "name": "Existed_Raid", 00:12:24.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.974 "strip_size_kb": 64, 00:12:24.974 "state": "configuring", 00:12:24.974 "raid_level": "raid0", 00:12:24.974 "superblock": false, 00:12:24.975 "num_base_bdevs": 3, 00:12:24.975 "num_base_bdevs_discovered": 1, 00:12:24.975 "num_base_bdevs_operational": 3, 00:12:24.975 "base_bdevs_list": [ 00:12:24.975 { 00:12:24.975 "name": null, 00:12:24.975 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:24.975 "is_configured": false, 00:12:24.975 "data_offset": 0, 00:12:24.975 "data_size": 65536 00:12:24.975 }, 00:12:24.975 { 00:12:24.975 "name": null, 00:12:24.975 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:24.975 "is_configured": false, 00:12:24.975 "data_offset": 0, 00:12:24.975 "data_size": 65536 00:12:24.975 }, 00:12:24.975 { 00:12:24.975 "name": "BaseBdev3", 00:12:24.975 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:24.975 "is_configured": true, 00:12:24.975 "data_offset": 0, 00:12:24.975 "data_size": 65536 00:12:24.975 } 00:12:24.975 ] 00:12:24.975 }' 00:12:24.975 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:24.975 23:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:25.542 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.542 23:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:25.800 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:12:25.800 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:25.800 [2024-05-14 23:54:26.373666] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 
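The verify_raid_bdev_state calls traced here boil down to pulling the Existed_Raid entry out of bdev_raid_get_bdevs and comparing a handful of fields. A rough sketch of those assertions, restricted to fields that appear in the JSON dumps above and to the values expected at this point in the run (state "configuring", raid0, 64 KiB strip, 3 members); the variable names are ours:

#!/usr/bin/env bash
set -e   # treat a failed comparison as a test failure
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')

[[ $(jq -r '.state'         <<<"$info") == "configuring" ]]
[[ $(jq -r '.raid_level'    <<<"$info") == "raid0" ]]
[[ $(jq -r '.strip_size_kb' <<<"$info") == "64" ]]
[[ $(jq -r '.num_base_bdevs_operational' <<<"$info") == "3" ]]
# A deleted member leaves a placeholder slot: "name": null, is_configured false.
jq -e 'any(.base_bdevs_list[]; .is_configured == false)' <<<"$info" >/dev/null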
00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:26.059 "name": "Existed_Raid", 00:12:26.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.059 "strip_size_kb": 64, 00:12:26.059 "state": "configuring", 00:12:26.059 "raid_level": "raid0", 00:12:26.059 "superblock": false, 00:12:26.059 "num_base_bdevs": 3, 00:12:26.059 "num_base_bdevs_discovered": 2, 00:12:26.059 "num_base_bdevs_operational": 3, 00:12:26.059 "base_bdevs_list": [ 00:12:26.059 { 00:12:26.059 "name": null, 00:12:26.059 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:26.059 "is_configured": false, 00:12:26.059 "data_offset": 0, 00:12:26.059 "data_size": 65536 00:12:26.059 }, 00:12:26.059 { 00:12:26.059 "name": "BaseBdev2", 00:12:26.059 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:26.059 "is_configured": true, 00:12:26.059 "data_offset": 0, 00:12:26.059 "data_size": 65536 00:12:26.059 }, 00:12:26.059 { 00:12:26.059 "name": "BaseBdev3", 00:12:26.059 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:26.059 "is_configured": true, 00:12:26.059 "data_offset": 0, 00:12:26.059 "data_size": 65536 00:12:26.059 } 00:12:26.059 ] 00:12:26.059 }' 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:26.059 23:54:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.994 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.995 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:26.995 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:12:26.995 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.995 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:27.253 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2e16453b-5a6a-47b9-bf02-239f9e7dab9e 00:12:27.511 [2024-05-14 23:54:27.934281] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:27.511 [2024-05-14 23:54:27.934322] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d4f70 00:12:27.511 [2024-05-14 23:54:27.934330] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:27.511 [2024-05-14 23:54:27.934540] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22dc270 00:12:27.511 [2024-05-14 23:54:27.934670] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d4f70 00:12:27.511 [2024-05-14 23:54:27.934680] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22d4f70 00:12:27.511 [2024-05-14 23:54:27.934839] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:27.511 NewBaseBdev 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:27.511 23:54:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.769 23:54:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:28.027 [ 00:12:28.027 { 00:12:28.027 "name": "NewBaseBdev", 00:12:28.027 "aliases": [ 00:12:28.027 "2e16453b-5a6a-47b9-bf02-239f9e7dab9e" 00:12:28.027 ], 00:12:28.027 "product_name": "Malloc disk", 00:12:28.027 "block_size": 512, 00:12:28.027 "num_blocks": 65536, 00:12:28.027 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:28.027 "assigned_rate_limits": { 00:12:28.027 "rw_ios_per_sec": 0, 00:12:28.027 "rw_mbytes_per_sec": 0, 00:12:28.027 "r_mbytes_per_sec": 0, 00:12:28.027 "w_mbytes_per_sec": 0 00:12:28.027 }, 00:12:28.027 "claimed": true, 00:12:28.027 "claim_type": "exclusive_write", 00:12:28.027 "zoned": false, 00:12:28.027 "supported_io_types": { 00:12:28.027 "read": true, 00:12:28.027 "write": true, 00:12:28.027 "unmap": true, 00:12:28.027 "write_zeroes": true, 00:12:28.027 "flush": true, 00:12:28.027 "reset": true, 00:12:28.027 "compare": false, 00:12:28.027 "compare_and_write": false, 00:12:28.027 "abort": true, 00:12:28.027 "nvme_admin": false, 00:12:28.027 "nvme_io": false 00:12:28.027 }, 00:12:28.027 "memory_domains": [ 00:12:28.027 { 00:12:28.027 "dma_device_id": "system", 00:12:28.027 "dma_device_type": 1 00:12:28.027 }, 00:12:28.027 { 00:12:28.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.027 "dma_device_type": 2 00:12:28.027 } 00:12:28.027 ], 00:12:28.027 "driver_specific": {} 00:12:28.027 } 00:12:28.027 ] 00:12:28.027 23:54:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # return 0 00:12:28.027 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.028 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.286 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:28.286 "name": "Existed_Raid", 00:12:28.286 "uuid": "5e55348c-aa67-4074-b3e2-3f33d858bf07", 00:12:28.286 "strip_size_kb": 64, 00:12:28.286 "state": "online", 00:12:28.286 "raid_level": "raid0", 00:12:28.286 "superblock": false, 00:12:28.286 "num_base_bdevs": 3, 00:12:28.286 "num_base_bdevs_discovered": 3, 00:12:28.286 "num_base_bdevs_operational": 3, 00:12:28.286 "base_bdevs_list": [ 00:12:28.286 { 00:12:28.286 "name": "NewBaseBdev", 00:12:28.286 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:28.286 "is_configured": true, 00:12:28.286 "data_offset": 0, 00:12:28.286 "data_size": 65536 00:12:28.286 }, 00:12:28.286 { 00:12:28.286 "name": "BaseBdev2", 00:12:28.286 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:28.286 "is_configured": true, 00:12:28.286 "data_offset": 0, 00:12:28.286 "data_size": 65536 00:12:28.286 }, 00:12:28.286 { 00:12:28.286 "name": "BaseBdev3", 00:12:28.286 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:28.286 "is_configured": true, 00:12:28.286 "data_offset": 0, 00:12:28.286 "data_size": 65536 00:12:28.286 } 00:12:28.286 ] 00:12:28.286 }' 00:12:28.286 23:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:28.286 23:54:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:28.853 23:54:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:28.853 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:29.112 [2024-05-14 23:54:29.470627] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:29.112 "name": "Existed_Raid", 00:12:29.112 "aliases": [ 00:12:29.112 "5e55348c-aa67-4074-b3e2-3f33d858bf07" 00:12:29.112 ], 00:12:29.112 "product_name": "Raid Volume", 00:12:29.112 "block_size": 512, 00:12:29.112 "num_blocks": 196608, 00:12:29.112 "uuid": "5e55348c-aa67-4074-b3e2-3f33d858bf07", 00:12:29.112 "assigned_rate_limits": { 00:12:29.112 "rw_ios_per_sec": 0, 00:12:29.112 "rw_mbytes_per_sec": 0, 00:12:29.112 "r_mbytes_per_sec": 0, 00:12:29.112 "w_mbytes_per_sec": 0 00:12:29.112 }, 00:12:29.112 "claimed": false, 00:12:29.112 "zoned": false, 00:12:29.112 "supported_io_types": { 00:12:29.112 "read": true, 00:12:29.112 "write": true, 00:12:29.112 "unmap": true, 00:12:29.112 "write_zeroes": true, 00:12:29.112 "flush": true, 00:12:29.112 "reset": true, 00:12:29.112 "compare": false, 00:12:29.112 "compare_and_write": false, 00:12:29.112 "abort": false, 00:12:29.112 "nvme_admin": false, 00:12:29.112 "nvme_io": false 00:12:29.112 }, 00:12:29.112 "memory_domains": [ 00:12:29.112 { 00:12:29.112 "dma_device_id": "system", 00:12:29.112 "dma_device_type": 1 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.112 "dma_device_type": 2 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "dma_device_id": "system", 00:12:29.112 "dma_device_type": 1 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.112 "dma_device_type": 2 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "dma_device_id": "system", 00:12:29.112 "dma_device_type": 1 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.112 "dma_device_type": 2 00:12:29.112 } 00:12:29.112 ], 00:12:29.112 "driver_specific": { 00:12:29.112 "raid": { 00:12:29.112 "uuid": "5e55348c-aa67-4074-b3e2-3f33d858bf07", 00:12:29.112 "strip_size_kb": 64, 00:12:29.112 "state": "online", 00:12:29.112 "raid_level": "raid0", 00:12:29.112 "superblock": false, 00:12:29.112 "num_base_bdevs": 3, 00:12:29.112 "num_base_bdevs_discovered": 3, 00:12:29.112 "num_base_bdevs_operational": 3, 00:12:29.112 "base_bdevs_list": [ 00:12:29.112 { 00:12:29.112 "name": "NewBaseBdev", 00:12:29.112 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:29.112 "is_configured": true, 00:12:29.112 "data_offset": 0, 00:12:29.112 "data_size": 65536 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "name": "BaseBdev2", 00:12:29.112 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:29.112 "is_configured": true, 00:12:29.112 "data_offset": 0, 00:12:29.112 "data_size": 65536 00:12:29.112 }, 00:12:29.112 { 00:12:29.112 "name": "BaseBdev3", 00:12:29.112 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:29.112 "is_configured": true, 00:12:29.112 "data_offset": 0, 00:12:29.112 "data_size": 65536 00:12:29.112 } 00:12:29.112 ] 00:12:29.112 } 00:12:29.112 } 00:12:29.112 }' 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:29.112 BaseBdev2 00:12:29.112 BaseBdev3' 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:29.112 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:29.371 "name": "NewBaseBdev", 00:12:29.371 "aliases": [ 00:12:29.371 "2e16453b-5a6a-47b9-bf02-239f9e7dab9e" 00:12:29.371 ], 00:12:29.371 "product_name": "Malloc disk", 00:12:29.371 "block_size": 512, 00:12:29.371 "num_blocks": 65536, 00:12:29.371 "uuid": "2e16453b-5a6a-47b9-bf02-239f9e7dab9e", 00:12:29.371 "assigned_rate_limits": { 00:12:29.371 "rw_ios_per_sec": 0, 00:12:29.371 "rw_mbytes_per_sec": 0, 00:12:29.371 "r_mbytes_per_sec": 0, 00:12:29.371 "w_mbytes_per_sec": 0 00:12:29.371 }, 00:12:29.371 "claimed": true, 00:12:29.371 "claim_type": "exclusive_write", 00:12:29.371 "zoned": false, 00:12:29.371 "supported_io_types": { 00:12:29.371 "read": true, 00:12:29.371 "write": true, 00:12:29.371 "unmap": true, 00:12:29.371 "write_zeroes": true, 00:12:29.371 "flush": true, 00:12:29.371 "reset": true, 00:12:29.371 "compare": false, 00:12:29.371 "compare_and_write": false, 00:12:29.371 "abort": true, 00:12:29.371 "nvme_admin": false, 00:12:29.371 "nvme_io": false 00:12:29.371 }, 00:12:29.371 "memory_domains": [ 00:12:29.371 { 00:12:29.371 "dma_device_id": "system", 00:12:29.371 "dma_device_type": 1 00:12:29.371 }, 00:12:29.371 { 00:12:29.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.371 "dma_device_type": 2 00:12:29.371 } 00:12:29.371 ], 00:12:29.371 "driver_specific": {} 00:12:29.371 }' 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:29.371 23:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:29.630 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:29.888 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:29.888 "name": "BaseBdev2", 00:12:29.888 "aliases": [ 00:12:29.888 "f841054e-5657-49d9-bd0b-a972b54fd0ec" 00:12:29.888 ], 00:12:29.888 "product_name": "Malloc disk", 00:12:29.888 "block_size": 512, 00:12:29.888 "num_blocks": 65536, 00:12:29.888 "uuid": "f841054e-5657-49d9-bd0b-a972b54fd0ec", 00:12:29.888 "assigned_rate_limits": { 00:12:29.888 "rw_ios_per_sec": 0, 00:12:29.888 "rw_mbytes_per_sec": 0, 00:12:29.888 "r_mbytes_per_sec": 0, 00:12:29.888 "w_mbytes_per_sec": 0 00:12:29.888 }, 00:12:29.888 "claimed": true, 00:12:29.888 "claim_type": "exclusive_write", 00:12:29.888 "zoned": false, 00:12:29.888 "supported_io_types": { 00:12:29.888 "read": true, 00:12:29.888 "write": true, 00:12:29.888 "unmap": true, 00:12:29.888 "write_zeroes": true, 00:12:29.888 "flush": true, 00:12:29.888 "reset": true, 00:12:29.888 "compare": false, 00:12:29.888 "compare_and_write": false, 00:12:29.888 "abort": true, 00:12:29.888 "nvme_admin": false, 00:12:29.888 "nvme_io": false 00:12:29.888 }, 00:12:29.888 "memory_domains": [ 00:12:29.888 { 00:12:29.888 "dma_device_id": "system", 00:12:29.888 "dma_device_type": 1 00:12:29.888 }, 00:12:29.888 { 00:12:29.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.888 "dma_device_type": 2 00:12:29.888 } 00:12:29.888 ], 00:12:29.888 "driver_specific": {} 00:12:29.888 }' 00:12:29.888 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.889 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.889 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:29.889 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:29.889 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:30.147 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:30.147 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.147 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.147 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:30.147 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.148 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.148 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:30.148 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:30.148 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:30.148 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:30.407 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:30.407 "name": "BaseBdev3", 00:12:30.407 "aliases": [ 00:12:30.407 "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf" 00:12:30.407 ], 00:12:30.407 "product_name": 
"Malloc disk", 00:12:30.407 "block_size": 512, 00:12:30.407 "num_blocks": 65536, 00:12:30.407 "uuid": "9dc2c9a4-0662-4a23-93cc-72ee1d7418bf", 00:12:30.407 "assigned_rate_limits": { 00:12:30.407 "rw_ios_per_sec": 0, 00:12:30.407 "rw_mbytes_per_sec": 0, 00:12:30.407 "r_mbytes_per_sec": 0, 00:12:30.407 "w_mbytes_per_sec": 0 00:12:30.407 }, 00:12:30.407 "claimed": true, 00:12:30.407 "claim_type": "exclusive_write", 00:12:30.407 "zoned": false, 00:12:30.407 "supported_io_types": { 00:12:30.407 "read": true, 00:12:30.407 "write": true, 00:12:30.407 "unmap": true, 00:12:30.407 "write_zeroes": true, 00:12:30.407 "flush": true, 00:12:30.407 "reset": true, 00:12:30.407 "compare": false, 00:12:30.407 "compare_and_write": false, 00:12:30.407 "abort": true, 00:12:30.407 "nvme_admin": false, 00:12:30.407 "nvme_io": false 00:12:30.407 }, 00:12:30.407 "memory_domains": [ 00:12:30.407 { 00:12:30.407 "dma_device_id": "system", 00:12:30.407 "dma_device_type": 1 00:12:30.407 }, 00:12:30.407 { 00:12:30.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.407 "dma_device_type": 2 00:12:30.407 } 00:12:30.407 ], 00:12:30.407 "driver_specific": {} 00:12:30.407 }' 00:12:30.407 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:30.407 23:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.666 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.924 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:30.924 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:31.183 [2024-05-14 23:54:31.515811] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:31.183 [2024-05-14 23:54:31.515835] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:31.183 [2024-05-14 23:54:31.515898] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:31.183 [2024-05-14 23:54:31.515950] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:31.183 [2024-05-14 23:54:31.515963] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d4f70 name Existed_Raid, state offline 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 400278 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 400278 ']' 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@950 -- # kill -0 400278 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 400278 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 400278' 00:12:31.183 killing process with pid 400278 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 400278 00:12:31.183 [2024-05-14 23:54:31.585816] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:31.183 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 400278 00:12:31.183 [2024-05-14 23:54:31.612917] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:12:31.442 00:12:31.442 real 0m27.727s 00:12:31.442 user 0m50.755s 00:12:31.442 sys 0m4.993s 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.442 ************************************ 00:12:31.442 END TEST raid_state_function_test 00:12:31.442 ************************************ 00:12:31.442 23:54:31 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:31.442 23:54:31 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:12:31.442 23:54:31 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:31.442 23:54:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:31.442 ************************************ 00:12:31.442 START TEST raid_state_function_test_sb 00:12:31.442 ************************************ 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 true 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 
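The raid_state_function_test_sb run that starts here drives the same state machine with superblock=true; as the traces that follow show, the only change to the create call is the superblock flag (-s). Side by side, using the two create commands exactly as they appear in the traces (same RPC socket assumed, rpc variable name is ours):

#!/usr/bin/env bash
# Sketch: the two create calls differ only in the superblock flag (-s).
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# raid_state_function_test: no on-disk superblock
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# raid_state_function_test_sb: superblock enabled on the base bdevs
$rpc bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid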
00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=404910 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 404910' 00:12:31.442 Process raid pid: 404910 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 404910 /var/tmp/spdk-raid.sock 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 404910 ']' 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:31.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:31.442 23:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.442 [2024-05-14 23:54:32.012673] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:12:31.442 [2024-05-14 23:54:32.012743] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:31.701 [2024-05-14 23:54:32.141442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:31.701 [2024-05-14 23:54:32.246834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:31.959 [2024-05-14 23:54:32.322546] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.959 [2024-05-14 23:54:32.322581] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:32.526 23:54:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:32.526 23:54:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:12:32.526 23:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:32.784 [2024-05-14 23:54:33.159041] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:32.784 [2024-05-14 23:54:33.159083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:32.784 [2024-05-14 23:54:33.159095] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:32.784 [2024-05-14 23:54:33.159107] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:32.784 [2024-05-14 23:54:33.159116] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:32.784 [2024-05-14 23:54:33.159127] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.784 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.042 23:54:33 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:33.042 "name": "Existed_Raid", 00:12:33.042 "uuid": "36913aeb-6ab6-4c60-89a6-f4d89c4ad3bb", 00:12:33.042 "strip_size_kb": 64, 00:12:33.042 "state": "configuring", 00:12:33.042 "raid_level": "raid0", 00:12:33.042 "superblock": true, 00:12:33.042 "num_base_bdevs": 3, 00:12:33.042 "num_base_bdevs_discovered": 0, 00:12:33.042 "num_base_bdevs_operational": 3, 00:12:33.042 "base_bdevs_list": [ 00:12:33.042 { 00:12:33.042 "name": "BaseBdev1", 00:12:33.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.042 "is_configured": false, 00:12:33.042 "data_offset": 0, 00:12:33.042 "data_size": 0 00:12:33.042 }, 00:12:33.042 { 00:12:33.042 "name": "BaseBdev2", 00:12:33.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.042 "is_configured": false, 00:12:33.042 "data_offset": 0, 00:12:33.042 "data_size": 0 00:12:33.042 }, 00:12:33.042 { 00:12:33.042 "name": "BaseBdev3", 00:12:33.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.042 "is_configured": false, 00:12:33.042 "data_offset": 0, 00:12:33.042 "data_size": 0 00:12:33.042 } 00:12:33.042 ] 00:12:33.042 }' 00:12:33.042 23:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:33.042 23:54:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.607 23:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:33.864 [2024-05-14 23:54:34.229715] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:33.864 [2024-05-14 23:54:34.229745] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195fbe0 name Existed_Raid, state configuring 00:12:33.864 23:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:34.122 [2024-05-14 23:54:34.474382] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:34.122 [2024-05-14 23:54:34.474416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:34.122 [2024-05-14 23:54:34.474427] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:34.122 [2024-05-14 23:54:34.474438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:34.122 [2024-05-14 23:54:34.474447] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:34.122 [2024-05-14 23:54:34.474458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:34.122 23:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:34.381 [2024-05-14 23:54:34.732908] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:34.381 BaseBdev1 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:34.381 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:34.641 23:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:34.641 [ 00:12:34.641 { 00:12:34.641 "name": "BaseBdev1", 00:12:34.641 "aliases": [ 00:12:34.641 "d88724e6-1a5a-4a55-ac1d-98eeb6079026" 00:12:34.641 ], 00:12:34.641 "product_name": "Malloc disk", 00:12:34.641 "block_size": 512, 00:12:34.641 "num_blocks": 65536, 00:12:34.641 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:34.641 "assigned_rate_limits": { 00:12:34.641 "rw_ios_per_sec": 0, 00:12:34.641 "rw_mbytes_per_sec": 0, 00:12:34.641 "r_mbytes_per_sec": 0, 00:12:34.641 "w_mbytes_per_sec": 0 00:12:34.641 }, 00:12:34.641 "claimed": true, 00:12:34.641 "claim_type": "exclusive_write", 00:12:34.641 "zoned": false, 00:12:34.641 "supported_io_types": { 00:12:34.641 "read": true, 00:12:34.641 "write": true, 00:12:34.641 "unmap": true, 00:12:34.641 "write_zeroes": true, 00:12:34.641 "flush": true, 00:12:34.641 "reset": true, 00:12:34.641 "compare": false, 00:12:34.641 "compare_and_write": false, 00:12:34.641 "abort": true, 00:12:34.641 "nvme_admin": false, 00:12:34.641 "nvme_io": false 00:12:34.641 }, 00:12:34.641 "memory_domains": [ 00:12:34.641 { 00:12:34.641 "dma_device_id": "system", 00:12:34.641 "dma_device_type": 1 00:12:34.641 }, 00:12:34.641 { 00:12:34.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.641 "dma_device_type": 2 00:12:34.641 } 00:12:34.641 ], 00:12:34.641 "driver_specific": {} 00:12:34.641 } 00:12:34.641 ] 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:34.900 "name": "Existed_Raid", 00:12:34.900 "uuid": "fd516adb-6cd8-4de5-a611-894ff4a9ac41", 00:12:34.900 "strip_size_kb": 64, 00:12:34.900 "state": "configuring", 00:12:34.900 "raid_level": "raid0", 00:12:34.900 "superblock": true, 00:12:34.900 "num_base_bdevs": 3, 00:12:34.900 "num_base_bdevs_discovered": 1, 00:12:34.900 "num_base_bdevs_operational": 3, 00:12:34.900 "base_bdevs_list": [ 00:12:34.900 { 00:12:34.900 "name": "BaseBdev1", 00:12:34.900 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:34.900 "is_configured": true, 00:12:34.900 "data_offset": 2048, 00:12:34.900 "data_size": 63488 00:12:34.900 }, 00:12:34.900 { 00:12:34.900 "name": "BaseBdev2", 00:12:34.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.900 "is_configured": false, 00:12:34.900 "data_offset": 0, 00:12:34.900 "data_size": 0 00:12:34.900 }, 00:12:34.900 { 00:12:34.900 "name": "BaseBdev3", 00:12:34.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.900 "is_configured": false, 00:12:34.900 "data_offset": 0, 00:12:34.900 "data_size": 0 00:12:34.900 } 00:12:34.900 ] 00:12:34.900 }' 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:34.900 23:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:35.836 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:35.836 [2024-05-14 23:54:36.301064] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:35.836 [2024-05-14 23:54:36.301106] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195f4b0 name Existed_Raid, state configuring 00:12:35.836 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:36.095 [2024-05-14 23:54:36.541764] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:36.095 [2024-05-14 23:54:36.543310] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:36.095 [2024-05-14 23:54:36.543344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:36.095 [2024-05-14 23:54:36.543355] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:36.095 [2024-05-14 23:54:36.543367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.095 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.353 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:36.353 "name": "Existed_Raid", 00:12:36.353 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:36.353 "strip_size_kb": 64, 00:12:36.353 "state": "configuring", 00:12:36.353 "raid_level": "raid0", 00:12:36.353 "superblock": true, 00:12:36.353 "num_base_bdevs": 3, 00:12:36.353 "num_base_bdevs_discovered": 1, 00:12:36.353 "num_base_bdevs_operational": 3, 00:12:36.353 "base_bdevs_list": [ 00:12:36.353 { 00:12:36.353 "name": "BaseBdev1", 00:12:36.353 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:36.353 "is_configured": true, 00:12:36.353 "data_offset": 2048, 00:12:36.353 "data_size": 63488 00:12:36.353 }, 00:12:36.353 { 00:12:36.353 "name": "BaseBdev2", 00:12:36.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.354 "is_configured": false, 00:12:36.354 "data_offset": 0, 00:12:36.354 "data_size": 0 00:12:36.354 }, 00:12:36.354 { 00:12:36.354 "name": "BaseBdev3", 00:12:36.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.354 "is_configured": false, 00:12:36.354 "data_offset": 0, 00:12:36.354 "data_size": 0 00:12:36.354 } 00:12:36.354 ] 00:12:36.354 }' 00:12:36.354 23:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:36.354 23:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.920 23:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:37.178 [2024-05-14 23:54:37.623978] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:37.178 BaseBdev2 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:37.178 23:54:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:37.178 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.436 23:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:37.695 [ 00:12:37.695 { 00:12:37.695 "name": "BaseBdev2", 00:12:37.695 "aliases": [ 00:12:37.695 "12199460-0fff-4ac4-b46b-0c374ed94e91" 00:12:37.695 ], 00:12:37.695 "product_name": "Malloc disk", 00:12:37.695 "block_size": 512, 00:12:37.695 "num_blocks": 65536, 00:12:37.695 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:37.695 "assigned_rate_limits": { 00:12:37.695 "rw_ios_per_sec": 0, 00:12:37.695 "rw_mbytes_per_sec": 0, 00:12:37.695 "r_mbytes_per_sec": 0, 00:12:37.695 "w_mbytes_per_sec": 0 00:12:37.695 }, 00:12:37.695 "claimed": true, 00:12:37.695 "claim_type": "exclusive_write", 00:12:37.695 "zoned": false, 00:12:37.695 "supported_io_types": { 00:12:37.695 "read": true, 00:12:37.695 "write": true, 00:12:37.695 "unmap": true, 00:12:37.695 "write_zeroes": true, 00:12:37.695 "flush": true, 00:12:37.695 "reset": true, 00:12:37.695 "compare": false, 00:12:37.695 "compare_and_write": false, 00:12:37.695 "abort": true, 00:12:37.695 "nvme_admin": false, 00:12:37.695 "nvme_io": false 00:12:37.695 }, 00:12:37.695 "memory_domains": [ 00:12:37.695 { 00:12:37.695 "dma_device_id": "system", 00:12:37.695 "dma_device_type": 1 00:12:37.695 }, 00:12:37.695 { 00:12:37.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.695 "dma_device_type": 2 00:12:37.695 } 00:12:37.695 ], 00:12:37.695 "driver_specific": {} 00:12:37.695 } 00:12:37.695 ] 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:37.695 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:37.696 23:54:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:37.696 "name": "Existed_Raid", 00:12:37.696 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:37.696 "strip_size_kb": 64, 00:12:37.696 "state": "configuring", 00:12:37.696 "raid_level": "raid0", 00:12:37.696 "superblock": true, 00:12:37.696 "num_base_bdevs": 3, 00:12:37.696 "num_base_bdevs_discovered": 2, 00:12:37.696 "num_base_bdevs_operational": 3, 00:12:37.696 "base_bdevs_list": [ 00:12:37.696 { 00:12:37.696 "name": "BaseBdev1", 00:12:37.696 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:37.696 "is_configured": true, 00:12:37.696 "data_offset": 2048, 00:12:37.696 "data_size": 63488 00:12:37.696 }, 00:12:37.696 { 00:12:37.696 "name": "BaseBdev2", 00:12:37.696 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:37.696 "is_configured": true, 00:12:37.696 "data_offset": 2048, 00:12:37.696 "data_size": 63488 00:12:37.696 }, 00:12:37.696 { 00:12:37.696 "name": "BaseBdev3", 00:12:37.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.696 "is_configured": false, 00:12:37.696 "data_offset": 0, 00:12:37.696 "data_size": 0 00:12:37.696 } 00:12:37.696 ] 00:12:37.696 }' 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:37.696 23:54:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.263 23:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:38.521 [2024-05-14 23:54:39.071366] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:38.521 [2024-05-14 23:54:39.071541] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1960560 00:12:38.521 [2024-05-14 23:54:39.071557] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:38.521 [2024-05-14 23:54:39.071735] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1977490 00:12:38.521 [2024-05-14 23:54:39.071856] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1960560 00:12:38.521 [2024-05-14 23:54:39.071871] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1960560 00:12:38.521 [2024-05-14 23:54:39.071965] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:38.521 BaseBdev3 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 
00:12:38.521 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:38.778 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:39.035 [ 00:12:39.035 { 00:12:39.035 "name": "BaseBdev3", 00:12:39.035 "aliases": [ 00:12:39.035 "9102d0da-af89-44c0-bfc8-8306f9f109c6" 00:12:39.035 ], 00:12:39.035 "product_name": "Malloc disk", 00:12:39.035 "block_size": 512, 00:12:39.035 "num_blocks": 65536, 00:12:39.035 "uuid": "9102d0da-af89-44c0-bfc8-8306f9f109c6", 00:12:39.035 "assigned_rate_limits": { 00:12:39.035 "rw_ios_per_sec": 0, 00:12:39.035 "rw_mbytes_per_sec": 0, 00:12:39.035 "r_mbytes_per_sec": 0, 00:12:39.035 "w_mbytes_per_sec": 0 00:12:39.035 }, 00:12:39.035 "claimed": true, 00:12:39.035 "claim_type": "exclusive_write", 00:12:39.035 "zoned": false, 00:12:39.035 "supported_io_types": { 00:12:39.035 "read": true, 00:12:39.035 "write": true, 00:12:39.035 "unmap": true, 00:12:39.035 "write_zeroes": true, 00:12:39.035 "flush": true, 00:12:39.035 "reset": true, 00:12:39.035 "compare": false, 00:12:39.035 "compare_and_write": false, 00:12:39.035 "abort": true, 00:12:39.035 "nvme_admin": false, 00:12:39.035 "nvme_io": false 00:12:39.035 }, 00:12:39.035 "memory_domains": [ 00:12:39.035 { 00:12:39.035 "dma_device_id": "system", 00:12:39.035 "dma_device_type": 1 00:12:39.035 }, 00:12:39.035 { 00:12:39.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.035 "dma_device_type": 2 00:12:39.035 } 00:12:39.035 ], 00:12:39.035 "driver_specific": {} 00:12:39.035 } 00:12:39.035 ] 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:39.035 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.035 23:54:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.292 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:39.292 "name": "Existed_Raid", 00:12:39.292 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:39.292 "strip_size_kb": 64, 00:12:39.292 "state": "online", 00:12:39.292 "raid_level": "raid0", 00:12:39.292 "superblock": true, 00:12:39.292 "num_base_bdevs": 3, 00:12:39.292 "num_base_bdevs_discovered": 3, 00:12:39.292 "num_base_bdevs_operational": 3, 00:12:39.292 "base_bdevs_list": [ 00:12:39.292 { 00:12:39.293 "name": "BaseBdev1", 00:12:39.293 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:39.293 "is_configured": true, 00:12:39.293 "data_offset": 2048, 00:12:39.293 "data_size": 63488 00:12:39.293 }, 00:12:39.293 { 00:12:39.293 "name": "BaseBdev2", 00:12:39.293 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:39.293 "is_configured": true, 00:12:39.293 "data_offset": 2048, 00:12:39.293 "data_size": 63488 00:12:39.293 }, 00:12:39.293 { 00:12:39.293 "name": "BaseBdev3", 00:12:39.293 "uuid": "9102d0da-af89-44c0-bfc8-8306f9f109c6", 00:12:39.293 "is_configured": true, 00:12:39.293 "data_offset": 2048, 00:12:39.293 "data_size": 63488 00:12:39.293 } 00:12:39.293 ] 00:12:39.293 }' 00:12:39.293 23:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:39.293 23:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:39.863 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:40.121 [2024-05-14 23:54:40.515512] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:40.121 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:40.121 "name": "Existed_Raid", 00:12:40.121 "aliases": [ 00:12:40.121 "1caae88e-3b35-4971-9daa-70a5ae4472c3" 00:12:40.121 ], 00:12:40.121 "product_name": "Raid Volume", 00:12:40.121 "block_size": 512, 00:12:40.121 "num_blocks": 190464, 00:12:40.121 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:40.121 "assigned_rate_limits": { 00:12:40.121 "rw_ios_per_sec": 0, 00:12:40.121 "rw_mbytes_per_sec": 0, 00:12:40.121 "r_mbytes_per_sec": 0, 00:12:40.121 "w_mbytes_per_sec": 0 00:12:40.121 }, 00:12:40.121 "claimed": false, 00:12:40.121 "zoned": false, 00:12:40.121 "supported_io_types": { 00:12:40.121 "read": true, 00:12:40.121 "write": true, 00:12:40.121 "unmap": true, 00:12:40.121 "write_zeroes": true, 00:12:40.121 "flush": true, 00:12:40.121 "reset": true, 00:12:40.121 
"compare": false, 00:12:40.121 "compare_and_write": false, 00:12:40.121 "abort": false, 00:12:40.121 "nvme_admin": false, 00:12:40.121 "nvme_io": false 00:12:40.121 }, 00:12:40.121 "memory_domains": [ 00:12:40.121 { 00:12:40.121 "dma_device_id": "system", 00:12:40.121 "dma_device_type": 1 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.121 "dma_device_type": 2 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "dma_device_id": "system", 00:12:40.121 "dma_device_type": 1 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.121 "dma_device_type": 2 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "dma_device_id": "system", 00:12:40.121 "dma_device_type": 1 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.121 "dma_device_type": 2 00:12:40.121 } 00:12:40.121 ], 00:12:40.121 "driver_specific": { 00:12:40.121 "raid": { 00:12:40.121 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:40.121 "strip_size_kb": 64, 00:12:40.121 "state": "online", 00:12:40.121 "raid_level": "raid0", 00:12:40.121 "superblock": true, 00:12:40.121 "num_base_bdevs": 3, 00:12:40.121 "num_base_bdevs_discovered": 3, 00:12:40.121 "num_base_bdevs_operational": 3, 00:12:40.121 "base_bdevs_list": [ 00:12:40.121 { 00:12:40.121 "name": "BaseBdev1", 00:12:40.121 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:40.121 "is_configured": true, 00:12:40.121 "data_offset": 2048, 00:12:40.121 "data_size": 63488 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "name": "BaseBdev2", 00:12:40.121 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:40.121 "is_configured": true, 00:12:40.121 "data_offset": 2048, 00:12:40.121 "data_size": 63488 00:12:40.121 }, 00:12:40.121 { 00:12:40.121 "name": "BaseBdev3", 00:12:40.121 "uuid": "9102d0da-af89-44c0-bfc8-8306f9f109c6", 00:12:40.121 "is_configured": true, 00:12:40.121 "data_offset": 2048, 00:12:40.121 "data_size": 63488 00:12:40.121 } 00:12:40.121 ] 00:12:40.121 } 00:12:40.121 } 00:12:40.121 }' 00:12:40.121 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:40.122 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:12:40.122 BaseBdev2 00:12:40.122 BaseBdev3' 00:12:40.122 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:40.122 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:40.122 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:40.379 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:40.379 "name": "BaseBdev1", 00:12:40.379 "aliases": [ 00:12:40.379 "d88724e6-1a5a-4a55-ac1d-98eeb6079026" 00:12:40.379 ], 00:12:40.379 "product_name": "Malloc disk", 00:12:40.379 "block_size": 512, 00:12:40.379 "num_blocks": 65536, 00:12:40.379 "uuid": "d88724e6-1a5a-4a55-ac1d-98eeb6079026", 00:12:40.379 "assigned_rate_limits": { 00:12:40.379 "rw_ios_per_sec": 0, 00:12:40.379 "rw_mbytes_per_sec": 0, 00:12:40.379 "r_mbytes_per_sec": 0, 00:12:40.379 "w_mbytes_per_sec": 0 00:12:40.379 }, 00:12:40.379 "claimed": true, 00:12:40.379 "claim_type": "exclusive_write", 00:12:40.379 "zoned": false, 00:12:40.379 
"supported_io_types": { 00:12:40.379 "read": true, 00:12:40.379 "write": true, 00:12:40.379 "unmap": true, 00:12:40.379 "write_zeroes": true, 00:12:40.379 "flush": true, 00:12:40.379 "reset": true, 00:12:40.379 "compare": false, 00:12:40.379 "compare_and_write": false, 00:12:40.379 "abort": true, 00:12:40.379 "nvme_admin": false, 00:12:40.379 "nvme_io": false 00:12:40.379 }, 00:12:40.379 "memory_domains": [ 00:12:40.379 { 00:12:40.379 "dma_device_id": "system", 00:12:40.379 "dma_device_type": 1 00:12:40.379 }, 00:12:40.379 { 00:12:40.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.379 "dma_device_type": 2 00:12:40.379 } 00:12:40.379 ], 00:12:40.379 "driver_specific": {} 00:12:40.379 }' 00:12:40.379 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:40.379 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:40.379 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:40.379 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:40.637 23:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:40.637 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:40.895 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:40.895 "name": "BaseBdev2", 00:12:40.895 "aliases": [ 00:12:40.895 "12199460-0fff-4ac4-b46b-0c374ed94e91" 00:12:40.895 ], 00:12:40.896 "product_name": "Malloc disk", 00:12:40.896 "block_size": 512, 00:12:40.896 "num_blocks": 65536, 00:12:40.896 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:40.896 "assigned_rate_limits": { 00:12:40.896 "rw_ios_per_sec": 0, 00:12:40.896 "rw_mbytes_per_sec": 0, 00:12:40.896 "r_mbytes_per_sec": 0, 00:12:40.896 "w_mbytes_per_sec": 0 00:12:40.896 }, 00:12:40.896 "claimed": true, 00:12:40.896 "claim_type": "exclusive_write", 00:12:40.896 "zoned": false, 00:12:40.896 "supported_io_types": { 00:12:40.896 "read": true, 00:12:40.896 "write": true, 00:12:40.896 "unmap": true, 00:12:40.896 "write_zeroes": true, 00:12:40.896 "flush": true, 00:12:40.896 "reset": true, 00:12:40.896 "compare": false, 00:12:40.896 "compare_and_write": false, 00:12:40.896 "abort": true, 00:12:40.896 "nvme_admin": false, 00:12:40.896 "nvme_io": false 00:12:40.896 }, 00:12:40.896 "memory_domains": [ 00:12:40.896 { 
00:12:40.896 "dma_device_id": "system", 00:12:40.896 "dma_device_type": 1 00:12:40.896 }, 00:12:40.896 { 00:12:40.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.896 "dma_device_type": 2 00:12:40.896 } 00:12:40.896 ], 00:12:40.896 "driver_specific": {} 00:12:40.896 }' 00:12:40.896 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:40.896 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.154 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.413 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:41.413 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:41.413 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:41.413 23:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:41.671 "name": "BaseBdev3", 00:12:41.671 "aliases": [ 00:12:41.671 "9102d0da-af89-44c0-bfc8-8306f9f109c6" 00:12:41.671 ], 00:12:41.671 "product_name": "Malloc disk", 00:12:41.671 "block_size": 512, 00:12:41.671 "num_blocks": 65536, 00:12:41.671 "uuid": "9102d0da-af89-44c0-bfc8-8306f9f109c6", 00:12:41.671 "assigned_rate_limits": { 00:12:41.671 "rw_ios_per_sec": 0, 00:12:41.671 "rw_mbytes_per_sec": 0, 00:12:41.671 "r_mbytes_per_sec": 0, 00:12:41.671 "w_mbytes_per_sec": 0 00:12:41.671 }, 00:12:41.671 "claimed": true, 00:12:41.671 "claim_type": "exclusive_write", 00:12:41.671 "zoned": false, 00:12:41.671 "supported_io_types": { 00:12:41.671 "read": true, 00:12:41.671 "write": true, 00:12:41.671 "unmap": true, 00:12:41.671 "write_zeroes": true, 00:12:41.671 "flush": true, 00:12:41.671 "reset": true, 00:12:41.671 "compare": false, 00:12:41.671 "compare_and_write": false, 00:12:41.671 "abort": true, 00:12:41.671 "nvme_admin": false, 00:12:41.671 "nvme_io": false 00:12:41.671 }, 00:12:41.671 "memory_domains": [ 00:12:41.671 { 00:12:41.671 "dma_device_id": "system", 00:12:41.671 "dma_device_type": 1 00:12:41.671 }, 00:12:41.671 { 00:12:41.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.671 "dma_device_type": 2 00:12:41.671 } 00:12:41.671 ], 00:12:41.671 "driver_specific": {} 00:12:41.671 }' 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.671 23:54:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.671 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.929 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.929 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.929 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.929 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:41.929 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:42.187 [2024-05-14 23:54:42.604891] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:42.187 [2024-05-14 23:54:42.604925] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:42.187 [2024-05-14 23:54:42.604968] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.187 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.444 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:42.444 "name": "Existed_Raid", 00:12:42.444 "uuid": "1caae88e-3b35-4971-9daa-70a5ae4472c3", 00:12:42.444 "strip_size_kb": 64, 00:12:42.444 "state": "offline", 00:12:42.444 "raid_level": "raid0", 00:12:42.444 "superblock": true, 00:12:42.444 "num_base_bdevs": 3, 00:12:42.444 "num_base_bdevs_discovered": 2, 00:12:42.444 "num_base_bdevs_operational": 2, 00:12:42.444 "base_bdevs_list": [ 00:12:42.444 { 00:12:42.444 "name": null, 00:12:42.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.444 "is_configured": false, 00:12:42.444 "data_offset": 2048, 00:12:42.444 "data_size": 63488 00:12:42.444 }, 00:12:42.444 { 00:12:42.444 "name": "BaseBdev2", 00:12:42.444 "uuid": "12199460-0fff-4ac4-b46b-0c374ed94e91", 00:12:42.444 "is_configured": true, 00:12:42.444 "data_offset": 2048, 00:12:42.444 "data_size": 63488 00:12:42.444 }, 00:12:42.444 { 00:12:42.444 "name": "BaseBdev3", 00:12:42.444 "uuid": "9102d0da-af89-44c0-bfc8-8306f9f109c6", 00:12:42.444 "is_configured": true, 00:12:42.444 "data_offset": 2048, 00:12:42.444 "data_size": 63488 00:12:42.444 } 00:12:42.444 ] 00:12:42.444 }' 00:12:42.444 23:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:42.444 23:54:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.009 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:43.009 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:43.009 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.009 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:43.267 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:43.267 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:43.267 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:43.525 [2024-05-14 23:54:43.914355] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:43.525 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:43.525 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:43.525 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.525 23:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:43.783 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:43.783 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:43.783 23:54:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:44.041 [2024-05-14 23:54:44.420059] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:44.041 [2024-05-14 23:54:44.420105] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1960560 name Existed_Raid, state offline 00:12:44.041 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:44.041 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:44.041 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.041 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:44.299 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:44.557 BaseBdev2 00:12:44.557 23:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:44.557 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:44.557 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:44.557 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:44.558 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:44.558 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:44.558 23:54:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:44.816 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:45.075 [ 00:12:45.075 { 00:12:45.075 "name": "BaseBdev2", 00:12:45.075 "aliases": [ 00:12:45.075 "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c" 00:12:45.075 ], 00:12:45.075 "product_name": "Malloc disk", 00:12:45.075 "block_size": 512, 00:12:45.075 "num_blocks": 65536, 00:12:45.075 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:45.075 "assigned_rate_limits": { 00:12:45.075 "rw_ios_per_sec": 0, 00:12:45.075 "rw_mbytes_per_sec": 0, 00:12:45.075 "r_mbytes_per_sec": 0, 00:12:45.075 "w_mbytes_per_sec": 0 00:12:45.075 }, 00:12:45.075 "claimed": false, 00:12:45.075 "zoned": false, 00:12:45.075 "supported_io_types": { 00:12:45.075 "read": true, 00:12:45.075 "write": true, 
00:12:45.075 "unmap": true, 00:12:45.075 "write_zeroes": true, 00:12:45.075 "flush": true, 00:12:45.075 "reset": true, 00:12:45.075 "compare": false, 00:12:45.075 "compare_and_write": false, 00:12:45.075 "abort": true, 00:12:45.075 "nvme_admin": false, 00:12:45.075 "nvme_io": false 00:12:45.075 }, 00:12:45.075 "memory_domains": [ 00:12:45.075 { 00:12:45.075 "dma_device_id": "system", 00:12:45.075 "dma_device_type": 1 00:12:45.075 }, 00:12:45.075 { 00:12:45.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.075 "dma_device_type": 2 00:12:45.075 } 00:12:45.075 ], 00:12:45.075 "driver_specific": {} 00:12:45.075 } 00:12:45.075 ] 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:45.075 BaseBdev3 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:45.075 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:45.334 23:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:45.592 [ 00:12:45.592 { 00:12:45.592 "name": "BaseBdev3", 00:12:45.592 "aliases": [ 00:12:45.592 "f371ffa5-5545-4f7d-b0a6-f42c45d8df19" 00:12:45.592 ], 00:12:45.592 "product_name": "Malloc disk", 00:12:45.592 "block_size": 512, 00:12:45.592 "num_blocks": 65536, 00:12:45.592 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:45.592 "assigned_rate_limits": { 00:12:45.592 "rw_ios_per_sec": 0, 00:12:45.592 "rw_mbytes_per_sec": 0, 00:12:45.592 "r_mbytes_per_sec": 0, 00:12:45.592 "w_mbytes_per_sec": 0 00:12:45.592 }, 00:12:45.592 "claimed": false, 00:12:45.592 "zoned": false, 00:12:45.592 "supported_io_types": { 00:12:45.592 "read": true, 00:12:45.592 "write": true, 00:12:45.592 "unmap": true, 00:12:45.592 "write_zeroes": true, 00:12:45.592 "flush": true, 00:12:45.592 "reset": true, 00:12:45.592 "compare": false, 00:12:45.592 "compare_and_write": false, 00:12:45.592 "abort": true, 00:12:45.592 "nvme_admin": false, 00:12:45.592 "nvme_io": false 00:12:45.592 }, 00:12:45.592 "memory_domains": [ 00:12:45.592 { 00:12:45.592 "dma_device_id": "system", 00:12:45.592 "dma_device_type": 1 00:12:45.592 }, 00:12:45.592 { 00:12:45.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.592 "dma_device_type": 2 00:12:45.592 } 
00:12:45.592 ], 00:12:45.592 "driver_specific": {} 00:12:45.592 } 00:12:45.592 ] 00:12:45.592 23:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:45.592 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:45.592 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:45.592 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:45.850 [2024-05-14 23:54:46.391768] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:45.850 [2024-05-14 23:54:46.391812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:45.850 [2024-05-14 23:54:46.391833] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:45.850 [2024-05-14 23:54:46.393180] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.850 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.108 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:46.108 "name": "Existed_Raid", 00:12:46.108 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:46.108 "strip_size_kb": 64, 00:12:46.108 "state": "configuring", 00:12:46.108 "raid_level": "raid0", 00:12:46.108 "superblock": true, 00:12:46.108 "num_base_bdevs": 3, 00:12:46.108 "num_base_bdevs_discovered": 2, 00:12:46.108 "num_base_bdevs_operational": 3, 00:12:46.108 "base_bdevs_list": [ 00:12:46.108 { 00:12:46.108 "name": "BaseBdev1", 00:12:46.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.108 "is_configured": false, 00:12:46.108 "data_offset": 0, 00:12:46.108 "data_size": 0 00:12:46.108 }, 00:12:46.108 { 00:12:46.108 "name": "BaseBdev2", 00:12:46.108 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 
00:12:46.108 "is_configured": true, 00:12:46.108 "data_offset": 2048, 00:12:46.108 "data_size": 63488 00:12:46.108 }, 00:12:46.108 { 00:12:46.108 "name": "BaseBdev3", 00:12:46.108 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:46.108 "is_configured": true, 00:12:46.108 "data_offset": 2048, 00:12:46.108 "data_size": 63488 00:12:46.108 } 00:12:46.108 ] 00:12:46.108 }' 00:12:46.108 23:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:46.108 23:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.674 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:46.932 [2024-05-14 23:54:47.478633] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:46.932 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.933 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.191 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:47.191 "name": "Existed_Raid", 00:12:47.191 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:47.191 "strip_size_kb": 64, 00:12:47.191 "state": "configuring", 00:12:47.191 "raid_level": "raid0", 00:12:47.191 "superblock": true, 00:12:47.191 "num_base_bdevs": 3, 00:12:47.191 "num_base_bdevs_discovered": 1, 00:12:47.191 "num_base_bdevs_operational": 3, 00:12:47.191 "base_bdevs_list": [ 00:12:47.191 { 00:12:47.191 "name": "BaseBdev1", 00:12:47.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.191 "is_configured": false, 00:12:47.191 "data_offset": 0, 00:12:47.191 "data_size": 0 00:12:47.191 }, 00:12:47.191 { 00:12:47.191 "name": null, 00:12:47.191 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:47.191 "is_configured": false, 00:12:47.191 "data_offset": 2048, 00:12:47.191 "data_size": 63488 00:12:47.191 }, 00:12:47.191 { 00:12:47.191 "name": "BaseBdev3", 00:12:47.191 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:47.191 "is_configured": true, 00:12:47.191 
"data_offset": 2048, 00:12:47.191 "data_size": 63488 00:12:47.191 } 00:12:47.191 ] 00:12:47.191 }' 00:12:47.191 23:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:47.191 23:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.125 23:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.125 23:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:48.125 23:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:12:48.125 23:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:48.384 [2024-05-14 23:54:48.830852] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:48.384 BaseBdev1 00:12:48.384 23:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:12:48.384 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:48.385 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:48.385 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:48.385 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:48.385 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:48.385 23:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:48.643 23:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:48.902 [ 00:12:48.902 { 00:12:48.902 "name": "BaseBdev1", 00:12:48.902 "aliases": [ 00:12:48.902 "38b899a0-024b-4773-ad3a-5530ae39acc9" 00:12:48.902 ], 00:12:48.902 "product_name": "Malloc disk", 00:12:48.902 "block_size": 512, 00:12:48.902 "num_blocks": 65536, 00:12:48.902 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:48.902 "assigned_rate_limits": { 00:12:48.902 "rw_ios_per_sec": 0, 00:12:48.902 "rw_mbytes_per_sec": 0, 00:12:48.902 "r_mbytes_per_sec": 0, 00:12:48.902 "w_mbytes_per_sec": 0 00:12:48.902 }, 00:12:48.902 "claimed": true, 00:12:48.902 "claim_type": "exclusive_write", 00:12:48.902 "zoned": false, 00:12:48.902 "supported_io_types": { 00:12:48.902 "read": true, 00:12:48.902 "write": true, 00:12:48.902 "unmap": true, 00:12:48.902 "write_zeroes": true, 00:12:48.902 "flush": true, 00:12:48.902 "reset": true, 00:12:48.902 "compare": false, 00:12:48.902 "compare_and_write": false, 00:12:48.902 "abort": true, 00:12:48.902 "nvme_admin": false, 00:12:48.902 "nvme_io": false 00:12:48.902 }, 00:12:48.902 "memory_domains": [ 00:12:48.902 { 00:12:48.902 "dma_device_id": "system", 00:12:48.902 "dma_device_type": 1 00:12:48.902 }, 00:12:48.902 { 00:12:48.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.902 "dma_device_type": 2 00:12:48.902 } 00:12:48.902 ], 
00:12:48.902 "driver_specific": {} 00:12:48.902 } 00:12:48.902 ] 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.902 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.161 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:49.161 "name": "Existed_Raid", 00:12:49.161 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:49.161 "strip_size_kb": 64, 00:12:49.161 "state": "configuring", 00:12:49.161 "raid_level": "raid0", 00:12:49.161 "superblock": true, 00:12:49.161 "num_base_bdevs": 3, 00:12:49.161 "num_base_bdevs_discovered": 2, 00:12:49.161 "num_base_bdevs_operational": 3, 00:12:49.161 "base_bdevs_list": [ 00:12:49.161 { 00:12:49.161 "name": "BaseBdev1", 00:12:49.161 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:49.161 "is_configured": true, 00:12:49.161 "data_offset": 2048, 00:12:49.161 "data_size": 63488 00:12:49.161 }, 00:12:49.161 { 00:12:49.161 "name": null, 00:12:49.161 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:49.161 "is_configured": false, 00:12:49.161 "data_offset": 2048, 00:12:49.161 "data_size": 63488 00:12:49.161 }, 00:12:49.161 { 00:12:49.161 "name": "BaseBdev3", 00:12:49.161 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:49.161 "is_configured": true, 00:12:49.161 "data_offset": 2048, 00:12:49.161 "data_size": 63488 00:12:49.161 } 00:12:49.161 ] 00:12:49.161 }' 00:12:49.161 23:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:49.161 23:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.729 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.729 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:49.988 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ 
true == \t\r\u\e ]] 00:12:49.988 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:50.246 [2024-05-14 23:54:50.583524] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.246 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.528 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:50.528 "name": "Existed_Raid", 00:12:50.528 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:50.528 "strip_size_kb": 64, 00:12:50.528 "state": "configuring", 00:12:50.528 "raid_level": "raid0", 00:12:50.528 "superblock": true, 00:12:50.528 "num_base_bdevs": 3, 00:12:50.528 "num_base_bdevs_discovered": 1, 00:12:50.528 "num_base_bdevs_operational": 3, 00:12:50.528 "base_bdevs_list": [ 00:12:50.528 { 00:12:50.528 "name": "BaseBdev1", 00:12:50.528 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:50.528 "is_configured": true, 00:12:50.528 "data_offset": 2048, 00:12:50.528 "data_size": 63488 00:12:50.528 }, 00:12:50.528 { 00:12:50.528 "name": null, 00:12:50.528 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:50.528 "is_configured": false, 00:12:50.528 "data_offset": 2048, 00:12:50.528 "data_size": 63488 00:12:50.528 }, 00:12:50.528 { 00:12:50.528 "name": null, 00:12:50.528 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:50.528 "is_configured": false, 00:12:50.528 "data_offset": 2048, 00:12:50.528 "data_size": 63488 00:12:50.528 } 00:12:50.528 ] 00:12:50.528 }' 00:12:50.528 23:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:50.528 23:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.105 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.105 23:54:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:51.105 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:12:51.105 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:51.364 [2024-05-14 23:54:51.830849] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.364 23:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.624 23:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:51.624 "name": "Existed_Raid", 00:12:51.624 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:51.624 "strip_size_kb": 64, 00:12:51.624 "state": "configuring", 00:12:51.624 "raid_level": "raid0", 00:12:51.624 "superblock": true, 00:12:51.624 "num_base_bdevs": 3, 00:12:51.624 "num_base_bdevs_discovered": 2, 00:12:51.624 "num_base_bdevs_operational": 3, 00:12:51.624 "base_bdevs_list": [ 00:12:51.624 { 00:12:51.624 "name": "BaseBdev1", 00:12:51.624 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:51.624 "is_configured": true, 00:12:51.624 "data_offset": 2048, 00:12:51.624 "data_size": 63488 00:12:51.624 }, 00:12:51.624 { 00:12:51.624 "name": null, 00:12:51.624 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:51.624 "is_configured": false, 00:12:51.624 "data_offset": 2048, 00:12:51.624 "data_size": 63488 00:12:51.624 }, 00:12:51.624 { 00:12:51.624 "name": "BaseBdev3", 00:12:51.624 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:51.624 "is_configured": true, 00:12:51.624 "data_offset": 2048, 00:12:51.624 "data_size": 63488 00:12:51.624 } 00:12:51.624 ] 00:12:51.624 }' 00:12:51.624 23:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:51.624 23:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:52.192 23:54:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.192 23:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:52.451 23:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:12:52.451 23:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:52.710 [2024-05-14 23:54:53.130289] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.710 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.968 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:52.968 "name": "Existed_Raid", 00:12:52.968 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:52.968 "strip_size_kb": 64, 00:12:52.968 "state": "configuring", 00:12:52.968 "raid_level": "raid0", 00:12:52.968 "superblock": true, 00:12:52.968 "num_base_bdevs": 3, 00:12:52.968 "num_base_bdevs_discovered": 1, 00:12:52.968 "num_base_bdevs_operational": 3, 00:12:52.968 "base_bdevs_list": [ 00:12:52.968 { 00:12:52.968 "name": null, 00:12:52.968 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:52.968 "is_configured": false, 00:12:52.968 "data_offset": 2048, 00:12:52.968 "data_size": 63488 00:12:52.968 }, 00:12:52.968 { 00:12:52.968 "name": null, 00:12:52.968 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:52.968 "is_configured": false, 00:12:52.968 "data_offset": 2048, 00:12:52.968 "data_size": 63488 00:12:52.968 }, 00:12:52.968 { 00:12:52.968 "name": "BaseBdev3", 00:12:52.968 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:52.968 "is_configured": true, 00:12:52.968 "data_offset": 2048, 00:12:52.968 "data_size": 63488 00:12:52.968 } 00:12:52.968 ] 00:12:52.968 }' 00:12:52.968 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:12:52.968 23:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:53.535 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:53.535 23:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.795 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:12:53.795 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:54.055 [2024-05-14 23:54:54.433995] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.055 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.314 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:54.314 "name": "Existed_Raid", 00:12:54.314 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:54.314 "strip_size_kb": 64, 00:12:54.314 "state": "configuring", 00:12:54.314 "raid_level": "raid0", 00:12:54.314 "superblock": true, 00:12:54.314 "num_base_bdevs": 3, 00:12:54.314 "num_base_bdevs_discovered": 2, 00:12:54.314 "num_base_bdevs_operational": 3, 00:12:54.314 "base_bdevs_list": [ 00:12:54.314 { 00:12:54.314 "name": null, 00:12:54.314 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:54.314 "is_configured": false, 00:12:54.314 "data_offset": 2048, 00:12:54.314 "data_size": 63488 00:12:54.314 }, 00:12:54.314 { 00:12:54.314 "name": "BaseBdev2", 00:12:54.314 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:54.314 "is_configured": true, 00:12:54.314 "data_offset": 2048, 00:12:54.314 "data_size": 63488 00:12:54.314 }, 00:12:54.314 { 00:12:54.314 "name": "BaseBdev3", 00:12:54.314 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:54.314 "is_configured": true, 00:12:54.314 
"data_offset": 2048, 00:12:54.314 "data_size": 63488 00:12:54.314 } 00:12:54.314 ] 00:12:54.314 }' 00:12:54.314 23:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:54.314 23:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.880 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.880 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:54.880 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:12:54.880 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.880 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:55.139 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 38b899a0-024b-4773-ad3a-5530ae39acc9 00:12:55.398 [2024-05-14 23:54:55.857241] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:55.398 [2024-05-14 23:54:55.857393] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b04650 00:12:55.398 [2024-05-14 23:54:55.857421] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:55.398 [2024-05-14 23:54:55.857597] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b11120 00:12:55.398 [2024-05-14 23:54:55.857718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b04650 00:12:55.398 [2024-05-14 23:54:55.857734] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b04650 00:12:55.398 [2024-05-14 23:54:55.857828] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.398 NewBaseBdev 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:55.398 23:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:55.656 23:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:55.914 [ 00:12:55.914 { 00:12:55.914 "name": "NewBaseBdev", 00:12:55.914 "aliases": [ 00:12:55.914 "38b899a0-024b-4773-ad3a-5530ae39acc9" 00:12:55.914 ], 
00:12:55.914 "product_name": "Malloc disk", 00:12:55.914 "block_size": 512, 00:12:55.914 "num_blocks": 65536, 00:12:55.914 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:55.914 "assigned_rate_limits": { 00:12:55.914 "rw_ios_per_sec": 0, 00:12:55.914 "rw_mbytes_per_sec": 0, 00:12:55.914 "r_mbytes_per_sec": 0, 00:12:55.914 "w_mbytes_per_sec": 0 00:12:55.914 }, 00:12:55.914 "claimed": true, 00:12:55.914 "claim_type": "exclusive_write", 00:12:55.914 "zoned": false, 00:12:55.914 "supported_io_types": { 00:12:55.914 "read": true, 00:12:55.914 "write": true, 00:12:55.914 "unmap": true, 00:12:55.914 "write_zeroes": true, 00:12:55.914 "flush": true, 00:12:55.914 "reset": true, 00:12:55.914 "compare": false, 00:12:55.914 "compare_and_write": false, 00:12:55.914 "abort": true, 00:12:55.914 "nvme_admin": false, 00:12:55.914 "nvme_io": false 00:12:55.914 }, 00:12:55.914 "memory_domains": [ 00:12:55.914 { 00:12:55.914 "dma_device_id": "system", 00:12:55.914 "dma_device_type": 1 00:12:55.914 }, 00:12:55.914 { 00:12:55.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.914 "dma_device_type": 2 00:12:55.914 } 00:12:55.914 ], 00:12:55.914 "driver_specific": {} 00:12:55.914 } 00:12:55.914 ] 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.914 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.172 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:56.172 "name": "Existed_Raid", 00:12:56.172 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:56.172 "strip_size_kb": 64, 00:12:56.172 "state": "online", 00:12:56.172 "raid_level": "raid0", 00:12:56.172 "superblock": true, 00:12:56.172 "num_base_bdevs": 3, 00:12:56.172 "num_base_bdevs_discovered": 3, 00:12:56.172 "num_base_bdevs_operational": 3, 00:12:56.172 "base_bdevs_list": [ 00:12:56.172 { 00:12:56.172 "name": "NewBaseBdev", 00:12:56.172 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:56.172 "is_configured": true, 00:12:56.172 "data_offset": 2048, 00:12:56.172 "data_size": 63488 00:12:56.172 
}, 00:12:56.172 { 00:12:56.172 "name": "BaseBdev2", 00:12:56.172 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:56.173 "is_configured": true, 00:12:56.173 "data_offset": 2048, 00:12:56.173 "data_size": 63488 00:12:56.173 }, 00:12:56.173 { 00:12:56.173 "name": "BaseBdev3", 00:12:56.173 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:56.173 "is_configured": true, 00:12:56.173 "data_offset": 2048, 00:12:56.173 "data_size": 63488 00:12:56.173 } 00:12:56.173 ] 00:12:56.173 }' 00:12:56.173 23:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:56.173 23:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:56.764 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:56.764 [2024-05-14 23:54:57.341454] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:57.021 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:57.021 "name": "Existed_Raid", 00:12:57.021 "aliases": [ 00:12:57.021 "a9ef7918-fe23-4009-909b-ca7684e88306" 00:12:57.021 ], 00:12:57.021 "product_name": "Raid Volume", 00:12:57.021 "block_size": 512, 00:12:57.021 "num_blocks": 190464, 00:12:57.021 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:57.021 "assigned_rate_limits": { 00:12:57.021 "rw_ios_per_sec": 0, 00:12:57.021 "rw_mbytes_per_sec": 0, 00:12:57.021 "r_mbytes_per_sec": 0, 00:12:57.021 "w_mbytes_per_sec": 0 00:12:57.021 }, 00:12:57.021 "claimed": false, 00:12:57.021 "zoned": false, 00:12:57.021 "supported_io_types": { 00:12:57.021 "read": true, 00:12:57.021 "write": true, 00:12:57.021 "unmap": true, 00:12:57.021 "write_zeroes": true, 00:12:57.021 "flush": true, 00:12:57.021 "reset": true, 00:12:57.021 "compare": false, 00:12:57.021 "compare_and_write": false, 00:12:57.021 "abort": false, 00:12:57.021 "nvme_admin": false, 00:12:57.021 "nvme_io": false 00:12:57.021 }, 00:12:57.021 "memory_domains": [ 00:12:57.021 { 00:12:57.021 "dma_device_id": "system", 00:12:57.021 "dma_device_type": 1 00:12:57.021 }, 00:12:57.021 { 00:12:57.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.021 "dma_device_type": 2 00:12:57.021 }, 00:12:57.021 { 00:12:57.021 "dma_device_id": "system", 00:12:57.021 "dma_device_type": 1 00:12:57.021 }, 00:12:57.021 { 00:12:57.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.021 "dma_device_type": 2 00:12:57.021 }, 00:12:57.021 { 00:12:57.021 "dma_device_id": "system", 00:12:57.021 "dma_device_type": 1 00:12:57.021 }, 00:12:57.021 { 00:12:57.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.022 
"dma_device_type": 2 00:12:57.022 } 00:12:57.022 ], 00:12:57.022 "driver_specific": { 00:12:57.022 "raid": { 00:12:57.022 "uuid": "a9ef7918-fe23-4009-909b-ca7684e88306", 00:12:57.022 "strip_size_kb": 64, 00:12:57.022 "state": "online", 00:12:57.022 "raid_level": "raid0", 00:12:57.022 "superblock": true, 00:12:57.022 "num_base_bdevs": 3, 00:12:57.022 "num_base_bdevs_discovered": 3, 00:12:57.022 "num_base_bdevs_operational": 3, 00:12:57.022 "base_bdevs_list": [ 00:12:57.022 { 00:12:57.022 "name": "NewBaseBdev", 00:12:57.022 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:57.022 "is_configured": true, 00:12:57.022 "data_offset": 2048, 00:12:57.022 "data_size": 63488 00:12:57.022 }, 00:12:57.022 { 00:12:57.022 "name": "BaseBdev2", 00:12:57.022 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:57.022 "is_configured": true, 00:12:57.022 "data_offset": 2048, 00:12:57.022 "data_size": 63488 00:12:57.022 }, 00:12:57.022 { 00:12:57.022 "name": "BaseBdev3", 00:12:57.022 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:57.022 "is_configured": true, 00:12:57.022 "data_offset": 2048, 00:12:57.022 "data_size": 63488 00:12:57.022 } 00:12:57.022 ] 00:12:57.022 } 00:12:57.022 } 00:12:57.022 }' 00:12:57.022 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:57.022 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:57.022 BaseBdev2 00:12:57.022 BaseBdev3' 00:12:57.022 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:57.022 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:57.022 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:57.279 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:57.279 "name": "NewBaseBdev", 00:12:57.279 "aliases": [ 00:12:57.279 "38b899a0-024b-4773-ad3a-5530ae39acc9" 00:12:57.279 ], 00:12:57.279 "product_name": "Malloc disk", 00:12:57.279 "block_size": 512, 00:12:57.279 "num_blocks": 65536, 00:12:57.279 "uuid": "38b899a0-024b-4773-ad3a-5530ae39acc9", 00:12:57.279 "assigned_rate_limits": { 00:12:57.279 "rw_ios_per_sec": 0, 00:12:57.279 "rw_mbytes_per_sec": 0, 00:12:57.279 "r_mbytes_per_sec": 0, 00:12:57.279 "w_mbytes_per_sec": 0 00:12:57.279 }, 00:12:57.279 "claimed": true, 00:12:57.279 "claim_type": "exclusive_write", 00:12:57.279 "zoned": false, 00:12:57.279 "supported_io_types": { 00:12:57.279 "read": true, 00:12:57.279 "write": true, 00:12:57.279 "unmap": true, 00:12:57.279 "write_zeroes": true, 00:12:57.279 "flush": true, 00:12:57.279 "reset": true, 00:12:57.279 "compare": false, 00:12:57.279 "compare_and_write": false, 00:12:57.279 "abort": true, 00:12:57.279 "nvme_admin": false, 00:12:57.279 "nvme_io": false 00:12:57.279 }, 00:12:57.279 "memory_domains": [ 00:12:57.279 { 00:12:57.279 "dma_device_id": "system", 00:12:57.279 "dma_device_type": 1 00:12:57.279 }, 00:12:57.279 { 00:12:57.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.279 "dma_device_type": 2 00:12:57.279 } 00:12:57.279 ], 00:12:57.279 "driver_specific": {} 00:12:57.279 }' 00:12:57.279 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:57.279 23:54:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:57.279 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:57.279 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:57.280 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:57.280 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.280 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:57.538 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:57.538 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.538 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:57.538 23:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:57.538 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:57.538 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:57.538 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:57.538 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:57.796 "name": "BaseBdev2", 00:12:57.796 "aliases": [ 00:12:57.796 "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c" 00:12:57.796 ], 00:12:57.796 "product_name": "Malloc disk", 00:12:57.796 "block_size": 512, 00:12:57.796 "num_blocks": 65536, 00:12:57.796 "uuid": "a3f1771e-dfd0-403a-9bee-5ed69ea01a7c", 00:12:57.796 "assigned_rate_limits": { 00:12:57.796 "rw_ios_per_sec": 0, 00:12:57.796 "rw_mbytes_per_sec": 0, 00:12:57.796 "r_mbytes_per_sec": 0, 00:12:57.796 "w_mbytes_per_sec": 0 00:12:57.796 }, 00:12:57.796 "claimed": true, 00:12:57.796 "claim_type": "exclusive_write", 00:12:57.796 "zoned": false, 00:12:57.796 "supported_io_types": { 00:12:57.796 "read": true, 00:12:57.796 "write": true, 00:12:57.796 "unmap": true, 00:12:57.796 "write_zeroes": true, 00:12:57.796 "flush": true, 00:12:57.796 "reset": true, 00:12:57.796 "compare": false, 00:12:57.796 "compare_and_write": false, 00:12:57.796 "abort": true, 00:12:57.796 "nvme_admin": false, 00:12:57.796 "nvme_io": false 00:12:57.796 }, 00:12:57.796 "memory_domains": [ 00:12:57.796 { 00:12:57.796 "dma_device_id": "system", 00:12:57.796 "dma_device_type": 1 00:12:57.796 }, 00:12:57.796 { 00:12:57.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.796 "dma_device_type": 2 00:12:57.796 } 00:12:57.796 ], 00:12:57.796 "driver_specific": {} 00:12:57.796 }' 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:57.796 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:58.055 23:54:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:58.055 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:58.313 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:58.313 "name": "BaseBdev3", 00:12:58.313 "aliases": [ 00:12:58.313 "f371ffa5-5545-4f7d-b0a6-f42c45d8df19" 00:12:58.313 ], 00:12:58.313 "product_name": "Malloc disk", 00:12:58.313 "block_size": 512, 00:12:58.313 "num_blocks": 65536, 00:12:58.313 "uuid": "f371ffa5-5545-4f7d-b0a6-f42c45d8df19", 00:12:58.313 "assigned_rate_limits": { 00:12:58.313 "rw_ios_per_sec": 0, 00:12:58.313 "rw_mbytes_per_sec": 0, 00:12:58.313 "r_mbytes_per_sec": 0, 00:12:58.313 "w_mbytes_per_sec": 0 00:12:58.313 }, 00:12:58.313 "claimed": true, 00:12:58.313 "claim_type": "exclusive_write", 00:12:58.313 "zoned": false, 00:12:58.313 "supported_io_types": { 00:12:58.313 "read": true, 00:12:58.313 "write": true, 00:12:58.313 "unmap": true, 00:12:58.313 "write_zeroes": true, 00:12:58.313 "flush": true, 00:12:58.313 "reset": true, 00:12:58.313 "compare": false, 00:12:58.313 "compare_and_write": false, 00:12:58.313 "abort": true, 00:12:58.313 "nvme_admin": false, 00:12:58.313 "nvme_io": false 00:12:58.313 }, 00:12:58.313 "memory_domains": [ 00:12:58.313 { 00:12:58.313 "dma_device_id": "system", 00:12:58.313 "dma_device_type": 1 00:12:58.313 }, 00:12:58.313 { 00:12:58.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.313 "dma_device_type": 2 00:12:58.313 } 00:12:58.313 ], 00:12:58.313 "driver_specific": {} 00:12:58.313 }' 00:12:58.313 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:58.313 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:58.572 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:58.572 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:58.572 23:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:58.572 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.572 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.572 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.572 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.572 23:54:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:58.572 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:58.831 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:58.831 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:58.831 [2024-05-14 23:54:59.414707] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:58.831 [2024-05-14 23:54:59.414734] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:58.831 [2024-05-14 23:54:59.414799] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.831 [2024-05-14 23:54:59.414851] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.831 [2024-05-14 23:54:59.414864] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b04650 name Existed_Raid, state offline 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 404910 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 404910 ']' 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 404910 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 404910 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 404910' 00:12:59.089 killing process with pid 404910 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 404910 00:12:59.089 [2024-05-14 23:54:59.483520] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:59.089 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 404910 00:12:59.089 [2024-05-14 23:54:59.514193] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:59.348 23:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:12:59.348 00:12:59.348 real 0m27.815s 00:12:59.348 user 0m50.948s 00:12:59.348 sys 0m5.020s 00:12:59.348 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:59.348 23:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.348 ************************************ 00:12:59.348 END TEST raid_state_function_test_sb 00:12:59.348 ************************************ 00:12:59.348 23:54:59 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:12:59.348 23:54:59 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:12:59.348 23:54:59 bdev_raid -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:12:59.348 23:54:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:59.348 ************************************ 00:12:59.348 START TEST raid_superblock_test 00:12:59.348 ************************************ 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 3 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:12:59.348 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=409047 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 409047 /var/tmp/spdk-raid.sock 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 409047 ']' 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:59.349 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:59.349 23:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.349 [2024-05-14 23:54:59.913423] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:12:59.349 [2024-05-14 23:54:59.913486] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid409047 ] 00:12:59.608 [2024-05-14 23:55:00.043180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:59.608 [2024-05-14 23:55:00.148577] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.867 [2024-05-14 23:55:00.213967] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.867 [2024-05-14 23:55:00.214007] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:00.435 23:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:00.694 malloc1 00:13:00.694 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:00.953 [2024-05-14 23:55:01.303694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:00.953 [2024-05-14 23:55:01.303744] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:00.953 [2024-05-14 23:55:01.303768] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc2780 00:13:00.953 [2024-05-14 23:55:01.303780] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:00.953 [2024-05-14 23:55:01.305554] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:00.953 [2024-05-14 23:55:01.305589] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:00.953 pt1 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:00.953 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:01.212 malloc2 00:13:01.212 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:01.212 [2024-05-14 23:55:01.794710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:01.212 [2024-05-14 23:55:01.794758] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:01.212 [2024-05-14 23:55:01.794777] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc3b60 00:13:01.212 [2024-05-14 23:55:01.794790] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:01.212 [2024-05-14 23:55:01.796411] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:01.212 [2024-05-14 23:55:01.796439] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:01.212 pt2 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:01.471 23:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:01.471 malloc3 00:13:01.730 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:01.730 [2024-05-14 23:55:02.296628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:01.730 [2024-05-14 23:55:02.296674] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:01.730 [2024-05-14 23:55:02.296693] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116e080 00:13:01.730 [2024-05-14 23:55:02.296706] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:01.730 [2024-05-14 23:55:02.298282] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:01.730 [2024-05-14 23:55:02.298310] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:01.730 pt3 00:13:01.730 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:01.730 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:01.730 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:01.988 [2024-05-14 23:55:02.541355] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:01.988 [2024-05-14 23:55:02.542714] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:01.988 [2024-05-14 23:55:02.542769] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:01.988 [2024-05-14 23:55:02.542933] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1171910 00:13:01.988 [2024-05-14 23:55:02.542944] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:01.988 [2024-05-14 23:55:02.543152] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1170e30 00:13:01.988 [2024-05-14 23:55:02.543295] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1171910 00:13:01.988 [2024-05-14 23:55:02.543306] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1171910 00:13:01.988 [2024-05-14 23:55:02.543420] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:01.988 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.989 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:02.247 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:02.247 "name": "raid_bdev1", 00:13:02.247 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:02.247 "strip_size_kb": 64, 00:13:02.247 "state": "online", 00:13:02.247 "raid_level": "raid0", 00:13:02.247 "superblock": true, 00:13:02.247 "num_base_bdevs": 3, 
00:13:02.247 "num_base_bdevs_discovered": 3, 00:13:02.247 "num_base_bdevs_operational": 3, 00:13:02.247 "base_bdevs_list": [ 00:13:02.247 { 00:13:02.247 "name": "pt1", 00:13:02.247 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:02.247 "is_configured": true, 00:13:02.247 "data_offset": 2048, 00:13:02.247 "data_size": 63488 00:13:02.247 }, 00:13:02.247 { 00:13:02.247 "name": "pt2", 00:13:02.247 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:02.247 "is_configured": true, 00:13:02.247 "data_offset": 2048, 00:13:02.247 "data_size": 63488 00:13:02.247 }, 00:13:02.247 { 00:13:02.247 "name": "pt3", 00:13:02.247 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:02.247 "is_configured": true, 00:13:02.247 "data_offset": 2048, 00:13:02.247 "data_size": 63488 00:13:02.247 } 00:13:02.247 ] 00:13:02.247 }' 00:13:02.247 23:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:02.247 23:55:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:02.813 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:03.071 [2024-05-14 23:55:03.524287] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:03.071 "name": "raid_bdev1", 00:13:03.071 "aliases": [ 00:13:03.071 "13d7b689-19d7-44da-8054-993ed310c93a" 00:13:03.071 ], 00:13:03.071 "product_name": "Raid Volume", 00:13:03.071 "block_size": 512, 00:13:03.071 "num_blocks": 190464, 00:13:03.071 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:03.071 "assigned_rate_limits": { 00:13:03.071 "rw_ios_per_sec": 0, 00:13:03.071 "rw_mbytes_per_sec": 0, 00:13:03.071 "r_mbytes_per_sec": 0, 00:13:03.071 "w_mbytes_per_sec": 0 00:13:03.071 }, 00:13:03.071 "claimed": false, 00:13:03.071 "zoned": false, 00:13:03.071 "supported_io_types": { 00:13:03.071 "read": true, 00:13:03.071 "write": true, 00:13:03.071 "unmap": true, 00:13:03.071 "write_zeroes": true, 00:13:03.071 "flush": true, 00:13:03.071 "reset": true, 00:13:03.071 "compare": false, 00:13:03.071 "compare_and_write": false, 00:13:03.071 "abort": false, 00:13:03.071 "nvme_admin": false, 00:13:03.071 "nvme_io": false 00:13:03.071 }, 00:13:03.071 "memory_domains": [ 00:13:03.071 { 00:13:03.071 "dma_device_id": "system", 00:13:03.071 "dma_device_type": 1 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.071 "dma_device_type": 2 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "dma_device_id": "system", 00:13:03.071 "dma_device_type": 1 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:03.071 "dma_device_type": 2 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "dma_device_id": "system", 00:13:03.071 "dma_device_type": 1 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.071 "dma_device_type": 2 00:13:03.071 } 00:13:03.071 ], 00:13:03.071 "driver_specific": { 00:13:03.071 "raid": { 00:13:03.071 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:03.071 "strip_size_kb": 64, 00:13:03.071 "state": "online", 00:13:03.071 "raid_level": "raid0", 00:13:03.071 "superblock": true, 00:13:03.071 "num_base_bdevs": 3, 00:13:03.071 "num_base_bdevs_discovered": 3, 00:13:03.071 "num_base_bdevs_operational": 3, 00:13:03.071 "base_bdevs_list": [ 00:13:03.071 { 00:13:03.071 "name": "pt1", 00:13:03.071 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:03.071 "is_configured": true, 00:13:03.071 "data_offset": 2048, 00:13:03.071 "data_size": 63488 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "name": "pt2", 00:13:03.071 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:03.071 "is_configured": true, 00:13:03.071 "data_offset": 2048, 00:13:03.071 "data_size": 63488 00:13:03.071 }, 00:13:03.071 { 00:13:03.071 "name": "pt3", 00:13:03.071 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:03.071 "is_configured": true, 00:13:03.071 "data_offset": 2048, 00:13:03.071 "data_size": 63488 00:13:03.071 } 00:13:03.071 ] 00:13:03.071 } 00:13:03.071 } 00:13:03.071 }' 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:03.071 pt2 00:13:03.071 pt3' 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:03.071 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:03.330 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:03.330 "name": "pt1", 00:13:03.330 "aliases": [ 00:13:03.330 "bbdc22c1-7736-5a87-87aa-cae18fbdead0" 00:13:03.330 ], 00:13:03.330 "product_name": "passthru", 00:13:03.330 "block_size": 512, 00:13:03.330 "num_blocks": 65536, 00:13:03.330 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:03.330 "assigned_rate_limits": { 00:13:03.330 "rw_ios_per_sec": 0, 00:13:03.330 "rw_mbytes_per_sec": 0, 00:13:03.330 "r_mbytes_per_sec": 0, 00:13:03.330 "w_mbytes_per_sec": 0 00:13:03.330 }, 00:13:03.330 "claimed": true, 00:13:03.330 "claim_type": "exclusive_write", 00:13:03.330 "zoned": false, 00:13:03.330 "supported_io_types": { 00:13:03.330 "read": true, 00:13:03.330 "write": true, 00:13:03.330 "unmap": true, 00:13:03.330 "write_zeroes": true, 00:13:03.330 "flush": true, 00:13:03.330 "reset": true, 00:13:03.330 "compare": false, 00:13:03.330 "compare_and_write": false, 00:13:03.330 "abort": true, 00:13:03.330 "nvme_admin": false, 00:13:03.330 "nvme_io": false 00:13:03.330 }, 00:13:03.330 "memory_domains": [ 00:13:03.330 { 00:13:03.330 "dma_device_id": "system", 00:13:03.330 "dma_device_type": 1 00:13:03.330 }, 00:13:03.330 { 00:13:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.330 "dma_device_type": 2 00:13:03.330 } 00:13:03.330 ], 00:13:03.330 "driver_specific": { 
00:13:03.330 "passthru": { 00:13:03.330 "name": "pt1", 00:13:03.330 "base_bdev_name": "malloc1" 00:13:03.330 } 00:13:03.330 } 00:13:03.330 }' 00:13:03.330 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:03.330 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:03.330 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:03.330 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:03.591 23:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:03.591 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:03.848 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:03.848 "name": "pt2", 00:13:03.848 "aliases": [ 00:13:03.848 "3b4996a5-5627-57d8-9054-7f891a5f9234" 00:13:03.848 ], 00:13:03.848 "product_name": "passthru", 00:13:03.848 "block_size": 512, 00:13:03.848 "num_blocks": 65536, 00:13:03.848 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:03.848 "assigned_rate_limits": { 00:13:03.848 "rw_ios_per_sec": 0, 00:13:03.848 "rw_mbytes_per_sec": 0, 00:13:03.848 "r_mbytes_per_sec": 0, 00:13:03.848 "w_mbytes_per_sec": 0 00:13:03.848 }, 00:13:03.848 "claimed": true, 00:13:03.848 "claim_type": "exclusive_write", 00:13:03.848 "zoned": false, 00:13:03.848 "supported_io_types": { 00:13:03.848 "read": true, 00:13:03.848 "write": true, 00:13:03.848 "unmap": true, 00:13:03.848 "write_zeroes": true, 00:13:03.849 "flush": true, 00:13:03.849 "reset": true, 00:13:03.849 "compare": false, 00:13:03.849 "compare_and_write": false, 00:13:03.849 "abort": true, 00:13:03.849 "nvme_admin": false, 00:13:03.849 "nvme_io": false 00:13:03.849 }, 00:13:03.849 "memory_domains": [ 00:13:03.849 { 00:13:03.849 "dma_device_id": "system", 00:13:03.849 "dma_device_type": 1 00:13:03.849 }, 00:13:03.849 { 00:13:03.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.849 "dma_device_type": 2 00:13:03.849 } 00:13:03.849 ], 00:13:03.849 "driver_specific": { 00:13:03.849 "passthru": { 00:13:03.849 "name": "pt2", 00:13:03.849 "base_bdev_name": "malloc2" 00:13:03.849 } 00:13:03.849 } 00:13:03.849 }' 00:13:03.849 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 
]] 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.107 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:04.365 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:04.365 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:04.365 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:04.365 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:04.365 23:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:04.623 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:04.623 "name": "pt3", 00:13:04.623 "aliases": [ 00:13:04.623 "d2a30968-ed49-5435-8d8d-760271a89ead" 00:13:04.623 ], 00:13:04.623 "product_name": "passthru", 00:13:04.623 "block_size": 512, 00:13:04.623 "num_blocks": 65536, 00:13:04.623 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:04.623 "assigned_rate_limits": { 00:13:04.623 "rw_ios_per_sec": 0, 00:13:04.623 "rw_mbytes_per_sec": 0, 00:13:04.623 "r_mbytes_per_sec": 0, 00:13:04.623 "w_mbytes_per_sec": 0 00:13:04.623 }, 00:13:04.623 "claimed": true, 00:13:04.623 "claim_type": "exclusive_write", 00:13:04.623 "zoned": false, 00:13:04.623 "supported_io_types": { 00:13:04.623 "read": true, 00:13:04.623 "write": true, 00:13:04.623 "unmap": true, 00:13:04.623 "write_zeroes": true, 00:13:04.623 "flush": true, 00:13:04.623 "reset": true, 00:13:04.623 "compare": false, 00:13:04.623 "compare_and_write": false, 00:13:04.623 "abort": true, 00:13:04.623 "nvme_admin": false, 00:13:04.623 "nvme_io": false 00:13:04.623 }, 00:13:04.623 "memory_domains": [ 00:13:04.623 { 00:13:04.623 "dma_device_id": "system", 00:13:04.623 "dma_device_type": 1 00:13:04.623 }, 00:13:04.623 { 00:13:04.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.623 "dma_device_type": 2 00:13:04.623 } 00:13:04.623 ], 00:13:04.623 "driver_specific": { 00:13:04.623 "passthru": { 00:13:04.623 "name": "pt3", 00:13:04.623 "base_bdev_name": "malloc3" 00:13:04.623 } 00:13:04.624 } 00:13:04.624 }' 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.624 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:04.882 23:55:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:04.882 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:13:05.140 [2024-05-14 23:55:05.581769] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:05.140 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=13d7b689-19d7-44da-8054-993ed310c93a 00:13:05.140 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 13d7b689-19d7-44da-8054-993ed310c93a ']' 00:13:05.140 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:05.400 [2024-05-14 23:55:05.822168] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:05.400 [2024-05-14 23:55:05.822190] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.400 [2024-05-14 23:55:05.822242] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:05.400 [2024-05-14 23:55:05.822298] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:05.400 [2024-05-14 23:55:05.822310] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1171910 name raid_bdev1, state offline 00:13:05.400 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.400 23:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:13:05.659 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:13:05.659 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:13:05.659 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:05.659 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:05.917 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:05.917 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:06.176 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:06.176 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:06.434 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:06.434 23:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:06.434 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:13:06.434 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:06.434 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:06.434 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:06.434 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:06.692 [2024-05-14 23:55:07.253908] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:06.692 [2024-05-14 23:55:07.255299] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:06.692 [2024-05-14 23:55:07.255342] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:06.692 [2024-05-14 23:55:07.255390] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:06.692 [2024-05-14 23:55:07.255438] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:06.692 [2024-05-14 23:55:07.255461] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:06.692 [2024-05-14 23:55:07.255479] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.692 [2024-05-14 23:55:07.255489] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116e480 name raid_bdev1, state configuring 00:13:06.692 
request: 00:13:06.692 { 00:13:06.692 "name": "raid_bdev1", 00:13:06.692 "raid_level": "raid0", 00:13:06.692 "base_bdevs": [ 00:13:06.692 "malloc1", 00:13:06.692 "malloc2", 00:13:06.692 "malloc3" 00:13:06.692 ], 00:13:06.692 "superblock": false, 00:13:06.692 "strip_size_kb": 64, 00:13:06.692 "method": "bdev_raid_create", 00:13:06.692 "req_id": 1 00:13:06.692 } 00:13:06.692 Got JSON-RPC error response 00:13:06.692 response: 00:13:06.692 { 00:13:06.692 "code": -17, 00:13:06.692 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:06.692 } 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.692 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:13:06.982 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:13:06.982 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:13:06.982 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:07.247 [2024-05-14 23:55:07.739119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:07.247 [2024-05-14 23:55:07.739172] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:07.247 [2024-05-14 23:55:07.739193] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116baa0 00:13:07.247 [2024-05-14 23:55:07.739206] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:07.247 [2024-05-14 23:55:07.740858] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:07.248 [2024-05-14 23:55:07.740887] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:07.248 [2024-05-14 23:55:07.740956] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:13:07.248 [2024-05-14 23:55:07.740982] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:07.248 pt1 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.248 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:07.506 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:07.506 "name": "raid_bdev1", 00:13:07.506 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:07.506 "strip_size_kb": 64, 00:13:07.506 "state": "configuring", 00:13:07.506 "raid_level": "raid0", 00:13:07.506 "superblock": true, 00:13:07.506 "num_base_bdevs": 3, 00:13:07.506 "num_base_bdevs_discovered": 1, 00:13:07.506 "num_base_bdevs_operational": 3, 00:13:07.506 "base_bdevs_list": [ 00:13:07.506 { 00:13:07.506 "name": "pt1", 00:13:07.506 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:07.506 "is_configured": true, 00:13:07.506 "data_offset": 2048, 00:13:07.506 "data_size": 63488 00:13:07.506 }, 00:13:07.506 { 00:13:07.506 "name": null, 00:13:07.506 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:07.506 "is_configured": false, 00:13:07.506 "data_offset": 2048, 00:13:07.506 "data_size": 63488 00:13:07.506 }, 00:13:07.506 { 00:13:07.506 "name": null, 00:13:07.506 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:07.506 "is_configured": false, 00:13:07.506 "data_offset": 2048, 00:13:07.506 "data_size": 63488 00:13:07.506 } 00:13:07.506 ] 00:13:07.506 }' 00:13:07.506 23:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:07.506 23:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.074 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:13:08.074 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:08.332 [2024-05-14 23:55:08.801946] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:08.332 [2024-05-14 23:55:08.801996] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:08.332 [2024-05-14 23:55:08.802020] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116d470 00:13:08.332 [2024-05-14 23:55:08.802034] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:08.332 [2024-05-14 23:55:08.802370] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:08.332 [2024-05-14 23:55:08.802387] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:08.332 [2024-05-14 23:55:08.802461] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:13:08.332 [2024-05-14 23:55:08.802482] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:08.332 pt2 00:13:08.332 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:08.591 [2024-05-14 23:55:08.974421] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.591 23:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:08.850 23:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:08.850 "name": "raid_bdev1", 00:13:08.850 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:08.850 "strip_size_kb": 64, 00:13:08.850 "state": "configuring", 00:13:08.850 "raid_level": "raid0", 00:13:08.850 "superblock": true, 00:13:08.850 "num_base_bdevs": 3, 00:13:08.850 "num_base_bdevs_discovered": 1, 00:13:08.850 "num_base_bdevs_operational": 3, 00:13:08.850 "base_bdevs_list": [ 00:13:08.850 { 00:13:08.850 "name": "pt1", 00:13:08.850 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:08.850 "is_configured": true, 00:13:08.850 "data_offset": 2048, 00:13:08.850 "data_size": 63488 00:13:08.850 }, 00:13:08.850 { 00:13:08.850 "name": null, 00:13:08.850 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:08.850 "is_configured": false, 00:13:08.850 "data_offset": 2048, 00:13:08.850 "data_size": 63488 00:13:08.850 }, 00:13:08.850 { 00:13:08.850 "name": null, 00:13:08.850 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:08.850 "is_configured": false, 00:13:08.850 "data_offset": 2048, 00:13:08.850 "data_size": 63488 00:13:08.850 } 00:13:08.850 ] 00:13:08.850 }' 00:13:08.850 23:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:08.850 23:55:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.417 23:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:13:09.417 23:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:09.417 23:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:09.676 [2024-05-14 23:55:10.041258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:09.676 [2024-05-14 23:55:10.041316] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:13:09.676 [2024-05-14 23:55:10.041336] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc3370 00:13:09.676 [2024-05-14 23:55:10.041349] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.676 [2024-05-14 23:55:10.041711] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.676 [2024-05-14 23:55:10.041729] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:09.676 [2024-05-14 23:55:10.041799] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:13:09.676 [2024-05-14 23:55:10.041820] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:09.676 pt2 00:13:09.676 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:09.676 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:09.676 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:09.935 [2024-05-14 23:55:10.285908] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:09.935 [2024-05-14 23:55:10.285953] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.935 [2024-05-14 23:55:10.285972] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116f7f0 00:13:09.935 [2024-05-14 23:55:10.285985] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.935 [2024-05-14 23:55:10.286313] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.935 [2024-05-14 23:55:10.286329] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:09.935 [2024-05-14 23:55:10.286393] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:13:09.935 [2024-05-14 23:55:10.286425] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:09.935 [2024-05-14 23:55:10.286537] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x116cb40 00:13:09.935 [2024-05-14 23:55:10.286547] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:09.935 [2024-05-14 23:55:10.286718] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11712b0 00:13:09.935 [2024-05-14 23:55:10.286846] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116cb40 00:13:09.935 [2024-05-14 23:55:10.286855] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116cb40 00:13:09.935 [2024-05-14 23:55:10.286952] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:09.935 pt3 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:09.935 23:55:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.935 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:10.194 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:10.194 "name": "raid_bdev1", 00:13:10.194 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:10.194 "strip_size_kb": 64, 00:13:10.194 "state": "online", 00:13:10.194 "raid_level": "raid0", 00:13:10.194 "superblock": true, 00:13:10.194 "num_base_bdevs": 3, 00:13:10.194 "num_base_bdevs_discovered": 3, 00:13:10.194 "num_base_bdevs_operational": 3, 00:13:10.194 "base_bdevs_list": [ 00:13:10.194 { 00:13:10.194 "name": "pt1", 00:13:10.194 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:10.194 "is_configured": true, 00:13:10.194 "data_offset": 2048, 00:13:10.194 "data_size": 63488 00:13:10.194 }, 00:13:10.194 { 00:13:10.194 "name": "pt2", 00:13:10.194 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:10.194 "is_configured": true, 00:13:10.194 "data_offset": 2048, 00:13:10.194 "data_size": 63488 00:13:10.194 }, 00:13:10.194 { 00:13:10.194 "name": "pt3", 00:13:10.194 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:10.194 "is_configured": true, 00:13:10.194 "data_offset": 2048, 00:13:10.194 "data_size": 63488 00:13:10.195 } 00:13:10.195 ] 00:13:10.195 }' 00:13:10.195 23:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:10.195 23:55:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:10.762 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:11.021 [2024-05-14 23:55:11.409136] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:11.021 23:55:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:11.021 "name": "raid_bdev1", 00:13:11.021 "aliases": [ 00:13:11.021 "13d7b689-19d7-44da-8054-993ed310c93a" 00:13:11.021 ], 00:13:11.021 "product_name": "Raid Volume", 00:13:11.021 "block_size": 512, 00:13:11.021 "num_blocks": 190464, 00:13:11.021 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:11.021 "assigned_rate_limits": { 00:13:11.021 "rw_ios_per_sec": 0, 00:13:11.021 "rw_mbytes_per_sec": 0, 00:13:11.021 "r_mbytes_per_sec": 0, 00:13:11.021 "w_mbytes_per_sec": 0 00:13:11.021 }, 00:13:11.021 "claimed": false, 00:13:11.021 "zoned": false, 00:13:11.021 "supported_io_types": { 00:13:11.021 "read": true, 00:13:11.021 "write": true, 00:13:11.021 "unmap": true, 00:13:11.021 "write_zeroes": true, 00:13:11.021 "flush": true, 00:13:11.021 "reset": true, 00:13:11.021 "compare": false, 00:13:11.021 "compare_and_write": false, 00:13:11.021 "abort": false, 00:13:11.021 "nvme_admin": false, 00:13:11.021 "nvme_io": false 00:13:11.021 }, 00:13:11.021 "memory_domains": [ 00:13:11.021 { 00:13:11.021 "dma_device_id": "system", 00:13:11.021 "dma_device_type": 1 00:13:11.021 }, 00:13:11.021 { 00:13:11.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.022 "dma_device_type": 2 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "dma_device_id": "system", 00:13:11.022 "dma_device_type": 1 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.022 "dma_device_type": 2 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "dma_device_id": "system", 00:13:11.022 "dma_device_type": 1 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.022 "dma_device_type": 2 00:13:11.022 } 00:13:11.022 ], 00:13:11.022 "driver_specific": { 00:13:11.022 "raid": { 00:13:11.022 "uuid": "13d7b689-19d7-44da-8054-993ed310c93a", 00:13:11.022 "strip_size_kb": 64, 00:13:11.022 "state": "online", 00:13:11.022 "raid_level": "raid0", 00:13:11.022 "superblock": true, 00:13:11.022 "num_base_bdevs": 3, 00:13:11.022 "num_base_bdevs_discovered": 3, 00:13:11.022 "num_base_bdevs_operational": 3, 00:13:11.022 "base_bdevs_list": [ 00:13:11.022 { 00:13:11.022 "name": "pt1", 00:13:11.022 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:11.022 "is_configured": true, 00:13:11.022 "data_offset": 2048, 00:13:11.022 "data_size": 63488 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "name": "pt2", 00:13:11.022 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:11.022 "is_configured": true, 00:13:11.022 "data_offset": 2048, 00:13:11.022 "data_size": 63488 00:13:11.022 }, 00:13:11.022 { 00:13:11.022 "name": "pt3", 00:13:11.022 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:11.022 "is_configured": true, 00:13:11.022 "data_offset": 2048, 00:13:11.022 "data_size": 63488 00:13:11.022 } 00:13:11.022 ] 00:13:11.022 } 00:13:11.022 } 00:13:11.022 }' 00:13:11.022 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:11.022 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:11.022 pt2 00:13:11.022 pt3' 00:13:11.022 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:11.022 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:11.022 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
jq '.[]' 00:13:11.280 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:11.280 "name": "pt1", 00:13:11.280 "aliases": [ 00:13:11.280 "bbdc22c1-7736-5a87-87aa-cae18fbdead0" 00:13:11.280 ], 00:13:11.280 "product_name": "passthru", 00:13:11.280 "block_size": 512, 00:13:11.280 "num_blocks": 65536, 00:13:11.280 "uuid": "bbdc22c1-7736-5a87-87aa-cae18fbdead0", 00:13:11.280 "assigned_rate_limits": { 00:13:11.280 "rw_ios_per_sec": 0, 00:13:11.280 "rw_mbytes_per_sec": 0, 00:13:11.280 "r_mbytes_per_sec": 0, 00:13:11.280 "w_mbytes_per_sec": 0 00:13:11.280 }, 00:13:11.280 "claimed": true, 00:13:11.280 "claim_type": "exclusive_write", 00:13:11.280 "zoned": false, 00:13:11.280 "supported_io_types": { 00:13:11.280 "read": true, 00:13:11.280 "write": true, 00:13:11.280 "unmap": true, 00:13:11.280 "write_zeroes": true, 00:13:11.280 "flush": true, 00:13:11.280 "reset": true, 00:13:11.280 "compare": false, 00:13:11.280 "compare_and_write": false, 00:13:11.280 "abort": true, 00:13:11.280 "nvme_admin": false, 00:13:11.280 "nvme_io": false 00:13:11.280 }, 00:13:11.280 "memory_domains": [ 00:13:11.280 { 00:13:11.280 "dma_device_id": "system", 00:13:11.280 "dma_device_type": 1 00:13:11.280 }, 00:13:11.280 { 00:13:11.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.280 "dma_device_type": 2 00:13:11.280 } 00:13:11.280 ], 00:13:11.280 "driver_specific": { 00:13:11.280 "passthru": { 00:13:11.281 "name": "pt1", 00:13:11.281 "base_bdev_name": "malloc1" 00:13:11.281 } 00:13:11.281 } 00:13:11.281 }' 00:13:11.281 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:11.281 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:11.281 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:11.281 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:11.281 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:11.539 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.539 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:11.539 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:11.539 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.539 23:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:11.539 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:11.539 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:11.539 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:11.539 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:11.539 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:11.798 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:11.798 "name": "pt2", 00:13:11.798 "aliases": [ 00:13:11.798 "3b4996a5-5627-57d8-9054-7f891a5f9234" 00:13:11.798 ], 00:13:11.798 "product_name": "passthru", 00:13:11.798 "block_size": 512, 00:13:11.798 "num_blocks": 65536, 00:13:11.798 "uuid": "3b4996a5-5627-57d8-9054-7f891a5f9234", 00:13:11.798 "assigned_rate_limits": 
{ 00:13:11.798 "rw_ios_per_sec": 0, 00:13:11.798 "rw_mbytes_per_sec": 0, 00:13:11.798 "r_mbytes_per_sec": 0, 00:13:11.798 "w_mbytes_per_sec": 0 00:13:11.798 }, 00:13:11.798 "claimed": true, 00:13:11.798 "claim_type": "exclusive_write", 00:13:11.798 "zoned": false, 00:13:11.798 "supported_io_types": { 00:13:11.798 "read": true, 00:13:11.798 "write": true, 00:13:11.798 "unmap": true, 00:13:11.798 "write_zeroes": true, 00:13:11.798 "flush": true, 00:13:11.798 "reset": true, 00:13:11.798 "compare": false, 00:13:11.798 "compare_and_write": false, 00:13:11.798 "abort": true, 00:13:11.798 "nvme_admin": false, 00:13:11.798 "nvme_io": false 00:13:11.798 }, 00:13:11.798 "memory_domains": [ 00:13:11.798 { 00:13:11.798 "dma_device_id": "system", 00:13:11.798 "dma_device_type": 1 00:13:11.798 }, 00:13:11.798 { 00:13:11.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.798 "dma_device_type": 2 00:13:11.798 } 00:13:11.798 ], 00:13:11.798 "driver_specific": { 00:13:11.798 "passthru": { 00:13:11.798 "name": "pt2", 00:13:11.798 "base_bdev_name": "malloc2" 00:13:11.798 } 00:13:11.798 } 00:13:11.798 }' 00:13:11.798 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:11.798 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:12.055 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:12.312 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:12.312 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:12.312 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:12.312 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:12.569 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:12.569 "name": "pt3", 00:13:12.569 "aliases": [ 00:13:12.569 "d2a30968-ed49-5435-8d8d-760271a89ead" 00:13:12.569 ], 00:13:12.569 "product_name": "passthru", 00:13:12.570 "block_size": 512, 00:13:12.570 "num_blocks": 65536, 00:13:12.570 "uuid": "d2a30968-ed49-5435-8d8d-760271a89ead", 00:13:12.570 "assigned_rate_limits": { 00:13:12.570 "rw_ios_per_sec": 0, 00:13:12.570 "rw_mbytes_per_sec": 0, 00:13:12.570 "r_mbytes_per_sec": 0, 00:13:12.570 "w_mbytes_per_sec": 0 00:13:12.570 }, 00:13:12.570 "claimed": true, 00:13:12.570 "claim_type": "exclusive_write", 00:13:12.570 "zoned": false, 00:13:12.570 "supported_io_types": { 00:13:12.570 "read": true, 00:13:12.570 "write": true, 00:13:12.570 "unmap": true, 00:13:12.570 "write_zeroes": true, 00:13:12.570 
"flush": true, 00:13:12.570 "reset": true, 00:13:12.570 "compare": false, 00:13:12.570 "compare_and_write": false, 00:13:12.570 "abort": true, 00:13:12.570 "nvme_admin": false, 00:13:12.570 "nvme_io": false 00:13:12.570 }, 00:13:12.570 "memory_domains": [ 00:13:12.570 { 00:13:12.570 "dma_device_id": "system", 00:13:12.570 "dma_device_type": 1 00:13:12.570 }, 00:13:12.570 { 00:13:12.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.570 "dma_device_type": 2 00:13:12.570 } 00:13:12.570 ], 00:13:12.570 "driver_specific": { 00:13:12.570 "passthru": { 00:13:12.570 "name": "pt3", 00:13:12.570 "base_bdev_name": "malloc3" 00:13:12.570 } 00:13:12.570 } 00:13:12.570 }' 00:13:12.570 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:12.570 23:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:12.570 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:12.831 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:13:13.089 [2024-05-14 23:55:13.502696] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 13d7b689-19d7-44da-8054-993ed310c93a '!=' 13d7b689-19d7-44da-8054-993ed310c93a ']' 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 409047 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 409047 ']' 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 409047 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 409047 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:13.089 23:55:13 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 409047' 00:13:13.089 killing process with pid 409047 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 409047 00:13:13.089 [2024-05-14 23:55:13.575780] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:13.089 [2024-05-14 23:55:13.575848] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:13.089 [2024-05-14 23:55:13.575903] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:13.089 [2024-05-14 23:55:13.575915] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116cb40 name raid_bdev1, state offline 00:13:13.089 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 409047 00:13:13.089 [2024-05-14 23:55:13.607054] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:13.347 23:55:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:13:13.348 00:13:13.348 real 0m14.003s 00:13:13.348 user 0m25.174s 00:13:13.348 sys 0m2.519s 00:13:13.348 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:13.348 23:55:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.348 ************************************ 00:13:13.348 END TEST raid_superblock_test 00:13:13.348 ************************************ 00:13:13.348 23:55:13 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:13:13.348 23:55:13 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:13:13.348 23:55:13 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:13.348 23:55:13 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:13.348 23:55:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:13.607 ************************************ 00:13:13.607 START TEST raid_state_function_test 00:13:13.607 ************************************ 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 false 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:13.607 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 
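The process teardown traced just above reduces to a small guard-and-kill pattern: probe the pid with kill -0, look up its command name, refuse to signal sudo itself, then kill and reap it. A minimal sketch of that pattern follows, with the function shape assumed for illustration; the real killprocess() lives in common/autotest_common.sh and handles more corner cases than shown here.

  # Simplified teardown helper (assumed shape, mirroring the trace above).
  killprocess() {
      local pid=$1
      kill -0 "$pid" 2>/dev/null || return 0        # already gone, nothing to do
      local name
      name=$(ps --no-headers -o comm= "$pid")       # never signal sudo directly
      [ "$name" = sudo ] && return 1
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid" 2>/dev/null || true               # reap the child; ignore its exit status
  }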
00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=411259 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 411259' 00:13:13.608 Process raid pid: 411259 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 411259 /var/tmp/spdk-raid.sock 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 411259 ']' 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:13.608 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:13.608 23:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.608 [2024-05-14 23:55:14.017913] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:13:13.608 [2024-05-14 23:55:14.017977] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:13.608 [2024-05-14 23:55:14.146574] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:13.866 [2024-05-14 23:55:14.252034] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:13.866 [2024-05-14 23:55:14.309172] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:13.866 [2024-05-14 23:55:14.309206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:14.433 23:55:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:14.433 23:55:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:13:14.433 23:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:14.691 [2024-05-14 23:55:15.165847] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:14.691 [2024-05-14 23:55:15.165891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:14.691 [2024-05-14 23:55:15.165903] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:14.691 [2024-05-14 23:55:15.165915] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:14.691 [2024-05-14 23:55:15.165924] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:14.691 [2024-05-14 23:55:15.165935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:14.691 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:14.691 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:14.691 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:14.691 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.692 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.950 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:13:14.950 "name": "Existed_Raid", 00:13:14.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.950 "strip_size_kb": 64, 00:13:14.950 "state": "configuring", 00:13:14.950 "raid_level": "concat", 00:13:14.950 "superblock": false, 00:13:14.950 "num_base_bdevs": 3, 00:13:14.950 "num_base_bdevs_discovered": 0, 00:13:14.950 "num_base_bdevs_operational": 3, 00:13:14.950 "base_bdevs_list": [ 00:13:14.950 { 00:13:14.950 "name": "BaseBdev1", 00:13:14.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.950 "is_configured": false, 00:13:14.950 "data_offset": 0, 00:13:14.950 "data_size": 0 00:13:14.950 }, 00:13:14.950 { 00:13:14.950 "name": "BaseBdev2", 00:13:14.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.950 "is_configured": false, 00:13:14.950 "data_offset": 0, 00:13:14.950 "data_size": 0 00:13:14.950 }, 00:13:14.950 { 00:13:14.950 "name": "BaseBdev3", 00:13:14.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.950 "is_configured": false, 00:13:14.950 "data_offset": 0, 00:13:14.950 "data_size": 0 00:13:14.950 } 00:13:14.950 ] 00:13:14.950 }' 00:13:14.950 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:14.950 23:55:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.517 23:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:15.775 [2024-05-14 23:55:16.184395] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:15.775 [2024-05-14 23:55:16.184431] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1253be0 name Existed_Raid, state configuring 00:13:15.775 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:16.033 [2024-05-14 23:55:16.372927] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:16.033 [2024-05-14 23:55:16.372955] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:16.033 [2024-05-14 23:55:16.372966] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:16.033 [2024-05-14 23:55:16.372977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:16.033 [2024-05-14 23:55:16.372986] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:16.033 [2024-05-14 23:55:16.372997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:16.033 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:16.291 [2024-05-14 23:55:16.631407] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:16.291 BaseBdev1 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:16.291 23:55:16 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.291 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:16.548 [ 00:13:16.548 { 00:13:16.548 "name": "BaseBdev1", 00:13:16.548 "aliases": [ 00:13:16.548 "0a213dec-4e04-4bd5-ab33-f7bcee0e2078" 00:13:16.548 ], 00:13:16.548 "product_name": "Malloc disk", 00:13:16.548 "block_size": 512, 00:13:16.548 "num_blocks": 65536, 00:13:16.548 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:16.548 "assigned_rate_limits": { 00:13:16.548 "rw_ios_per_sec": 0, 00:13:16.548 "rw_mbytes_per_sec": 0, 00:13:16.548 "r_mbytes_per_sec": 0, 00:13:16.548 "w_mbytes_per_sec": 0 00:13:16.548 }, 00:13:16.548 "claimed": true, 00:13:16.548 "claim_type": "exclusive_write", 00:13:16.548 "zoned": false, 00:13:16.548 "supported_io_types": { 00:13:16.548 "read": true, 00:13:16.548 "write": true, 00:13:16.548 "unmap": true, 00:13:16.548 "write_zeroes": true, 00:13:16.548 "flush": true, 00:13:16.548 "reset": true, 00:13:16.548 "compare": false, 00:13:16.548 "compare_and_write": false, 00:13:16.548 "abort": true, 00:13:16.548 "nvme_admin": false, 00:13:16.548 "nvme_io": false 00:13:16.548 }, 00:13:16.548 "memory_domains": [ 00:13:16.548 { 00:13:16.548 "dma_device_id": "system", 00:13:16.548 "dma_device_type": 1 00:13:16.548 }, 00:13:16.548 { 00:13:16.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.548 "dma_device_type": 2 00:13:16.548 } 00:13:16.548 ], 00:13:16.548 "driver_specific": {} 00:13:16.548 } 00:13:16.548 ] 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
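Condensed, the trace above drives the target over its RPC socket in four steps: register the concat array before any member exists, create a single malloc base bdev, wait for examine to finish, then confirm the array is still configuring with one of three members discovered. A rough sketch of that sequence, assuming a bdev_svc app is already listening on the socket used in this run:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Register the raid before its members exist; it stays in the "configuring" state.
  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

  # Create one 32 MiB / 512 B-block member and wait until it has been examined.
  $rpc bdev_malloc_create 32 512 -b BaseBdev1
  $rpc bdev_wait_for_examine

  # Expect state "configuring", num_base_bdevs_discovered 1, num_base_bdevs_operational 3.
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'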
00:13:16.548 23:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.806 23:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:16.806 "name": "Existed_Raid", 00:13:16.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.806 "strip_size_kb": 64, 00:13:16.806 "state": "configuring", 00:13:16.806 "raid_level": "concat", 00:13:16.806 "superblock": false, 00:13:16.806 "num_base_bdevs": 3, 00:13:16.806 "num_base_bdevs_discovered": 1, 00:13:16.806 "num_base_bdevs_operational": 3, 00:13:16.806 "base_bdevs_list": [ 00:13:16.806 { 00:13:16.806 "name": "BaseBdev1", 00:13:16.806 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:16.806 "is_configured": true, 00:13:16.806 "data_offset": 0, 00:13:16.806 "data_size": 65536 00:13:16.806 }, 00:13:16.806 { 00:13:16.806 "name": "BaseBdev2", 00:13:16.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.806 "is_configured": false, 00:13:16.806 "data_offset": 0, 00:13:16.806 "data_size": 0 00:13:16.806 }, 00:13:16.806 { 00:13:16.806 "name": "BaseBdev3", 00:13:16.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.806 "is_configured": false, 00:13:16.806 "data_offset": 0, 00:13:16.806 "data_size": 0 00:13:16.806 } 00:13:16.806 ] 00:13:16.806 }' 00:13:16.806 23:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:16.806 23:55:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.372 23:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:17.372 [2024-05-14 23:55:17.938862] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:17.372 [2024-05-14 23:55:17.938906] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12534b0 name Existed_Raid, state configuring 00:13:17.372 23:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:17.631 [2024-05-14 23:55:18.175530] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:17.631 [2024-05-14 23:55:18.177170] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:17.631 [2024-05-14 23:55:18.177203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:17.631 [2024-05-14 23:55:18.177213] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:17.631 [2024-05-14 23:55:18.177225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:17.631 23:55:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.631 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.890 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:17.890 "name": "Existed_Raid", 00:13:17.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.890 "strip_size_kb": 64, 00:13:17.890 "state": "configuring", 00:13:17.890 "raid_level": "concat", 00:13:17.890 "superblock": false, 00:13:17.890 "num_base_bdevs": 3, 00:13:17.890 "num_base_bdevs_discovered": 1, 00:13:17.890 "num_base_bdevs_operational": 3, 00:13:17.890 "base_bdevs_list": [ 00:13:17.890 { 00:13:17.890 "name": "BaseBdev1", 00:13:17.890 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:17.890 "is_configured": true, 00:13:17.890 "data_offset": 0, 00:13:17.890 "data_size": 65536 00:13:17.890 }, 00:13:17.890 { 00:13:17.890 "name": "BaseBdev2", 00:13:17.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.890 "is_configured": false, 00:13:17.890 "data_offset": 0, 00:13:17.890 "data_size": 0 00:13:17.890 }, 00:13:17.890 { 00:13:17.890 "name": "BaseBdev3", 00:13:17.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.890 "is_configured": false, 00:13:17.890 "data_offset": 0, 00:13:17.890 "data_size": 0 00:13:17.890 } 00:13:17.890 ] 00:13:17.890 }' 00:13:17.890 23:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:17.890 23:55:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.458 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:18.717 [2024-05-14 23:55:19.269813] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:18.717 BaseBdev2 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:18.717 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:18.717 23:55:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:18.975 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:19.233 [ 00:13:19.233 { 00:13:19.233 "name": "BaseBdev2", 00:13:19.233 "aliases": [ 00:13:19.233 "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd" 00:13:19.233 ], 00:13:19.233 "product_name": "Malloc disk", 00:13:19.233 "block_size": 512, 00:13:19.233 "num_blocks": 65536, 00:13:19.233 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:19.233 "assigned_rate_limits": { 00:13:19.233 "rw_ios_per_sec": 0, 00:13:19.233 "rw_mbytes_per_sec": 0, 00:13:19.233 "r_mbytes_per_sec": 0, 00:13:19.233 "w_mbytes_per_sec": 0 00:13:19.233 }, 00:13:19.233 "claimed": true, 00:13:19.233 "claim_type": "exclusive_write", 00:13:19.233 "zoned": false, 00:13:19.233 "supported_io_types": { 00:13:19.233 "read": true, 00:13:19.233 "write": true, 00:13:19.233 "unmap": true, 00:13:19.233 "write_zeroes": true, 00:13:19.233 "flush": true, 00:13:19.233 "reset": true, 00:13:19.233 "compare": false, 00:13:19.233 "compare_and_write": false, 00:13:19.233 "abort": true, 00:13:19.233 "nvme_admin": false, 00:13:19.233 "nvme_io": false 00:13:19.233 }, 00:13:19.233 "memory_domains": [ 00:13:19.233 { 00:13:19.233 "dma_device_id": "system", 00:13:19.233 "dma_device_type": 1 00:13:19.233 }, 00:13:19.234 { 00:13:19.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.234 "dma_device_type": 2 00:13:19.234 } 00:13:19.234 ], 00:13:19.234 "driver_specific": {} 00:13:19.234 } 00:13:19.234 ] 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.234 23:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:13:19.492 23:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:19.492 "name": "Existed_Raid", 00:13:19.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.492 "strip_size_kb": 64, 00:13:19.492 "state": "configuring", 00:13:19.492 "raid_level": "concat", 00:13:19.492 "superblock": false, 00:13:19.492 "num_base_bdevs": 3, 00:13:19.492 "num_base_bdevs_discovered": 2, 00:13:19.492 "num_base_bdevs_operational": 3, 00:13:19.492 "base_bdevs_list": [ 00:13:19.492 { 00:13:19.492 "name": "BaseBdev1", 00:13:19.492 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:19.492 "is_configured": true, 00:13:19.492 "data_offset": 0, 00:13:19.492 "data_size": 65536 00:13:19.492 }, 00:13:19.492 { 00:13:19.492 "name": "BaseBdev2", 00:13:19.492 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:19.492 "is_configured": true, 00:13:19.492 "data_offset": 0, 00:13:19.492 "data_size": 65536 00:13:19.492 }, 00:13:19.492 { 00:13:19.492 "name": "BaseBdev3", 00:13:19.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.492 "is_configured": false, 00:13:19.492 "data_offset": 0, 00:13:19.492 "data_size": 0 00:13:19.492 } 00:13:19.492 ] 00:13:19.492 }' 00:13:19.492 23:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:19.492 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.060 23:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:20.319 [2024-05-14 23:55:20.865602] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:20.319 [2024-05-14 23:55:20.865638] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1254560 00:13:20.319 [2024-05-14 23:55:20.865647] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:20.319 [2024-05-14 23:55:20.865837] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x126b490 00:13:20.319 [2024-05-14 23:55:20.865959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1254560 00:13:20.319 [2024-05-14 23:55:20.865969] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1254560 00:13:20.319 [2024-05-14 23:55:20.866138] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:20.319 BaseBdev3 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:20.319 23:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.577 23:55:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:20.836 [ 00:13:20.836 { 00:13:20.836 "name": "BaseBdev3", 00:13:20.836 "aliases": [ 00:13:20.836 "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8" 00:13:20.836 ], 00:13:20.836 "product_name": "Malloc disk", 00:13:20.836 "block_size": 512, 00:13:20.836 "num_blocks": 65536, 00:13:20.836 "uuid": "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8", 00:13:20.836 "assigned_rate_limits": { 00:13:20.836 "rw_ios_per_sec": 0, 00:13:20.836 "rw_mbytes_per_sec": 0, 00:13:20.836 "r_mbytes_per_sec": 0, 00:13:20.836 "w_mbytes_per_sec": 0 00:13:20.836 }, 00:13:20.837 "claimed": true, 00:13:20.837 "claim_type": "exclusive_write", 00:13:20.837 "zoned": false, 00:13:20.837 "supported_io_types": { 00:13:20.837 "read": true, 00:13:20.837 "write": true, 00:13:20.837 "unmap": true, 00:13:20.837 "write_zeroes": true, 00:13:20.837 "flush": true, 00:13:20.837 "reset": true, 00:13:20.837 "compare": false, 00:13:20.837 "compare_and_write": false, 00:13:20.837 "abort": true, 00:13:20.837 "nvme_admin": false, 00:13:20.837 "nvme_io": false 00:13:20.837 }, 00:13:20.837 "memory_domains": [ 00:13:20.837 { 00:13:20.837 "dma_device_id": "system", 00:13:20.837 "dma_device_type": 1 00:13:20.837 }, 00:13:20.837 { 00:13:20.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.837 "dma_device_type": 2 00:13:20.837 } 00:13:20.837 ], 00:13:20.837 "driver_specific": {} 00:13:20.837 } 00:13:20.837 ] 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.837 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.096 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:21.096 "name": "Existed_Raid", 00:13:21.096 "uuid": "bfdb4eb5-1446-4696-87ae-d9ffd1b83f52", 00:13:21.096 "strip_size_kb": 64, 00:13:21.096 "state": "online", 
00:13:21.096 "raid_level": "concat", 00:13:21.096 "superblock": false, 00:13:21.096 "num_base_bdevs": 3, 00:13:21.096 "num_base_bdevs_discovered": 3, 00:13:21.096 "num_base_bdevs_operational": 3, 00:13:21.096 "base_bdevs_list": [ 00:13:21.096 { 00:13:21.096 "name": "BaseBdev1", 00:13:21.096 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:21.096 "is_configured": true, 00:13:21.096 "data_offset": 0, 00:13:21.096 "data_size": 65536 00:13:21.096 }, 00:13:21.096 { 00:13:21.096 "name": "BaseBdev2", 00:13:21.096 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:21.096 "is_configured": true, 00:13:21.096 "data_offset": 0, 00:13:21.096 "data_size": 65536 00:13:21.096 }, 00:13:21.096 { 00:13:21.096 "name": "BaseBdev3", 00:13:21.096 "uuid": "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8", 00:13:21.096 "is_configured": true, 00:13:21.096 "data_offset": 0, 00:13:21.096 "data_size": 65536 00:13:21.096 } 00:13:21.096 ] 00:13:21.096 }' 00:13:21.096 23:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:21.096 23:55:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:21.663 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:21.922 [2024-05-14 23:55:22.430035] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:21.922 "name": "Existed_Raid", 00:13:21.922 "aliases": [ 00:13:21.922 "bfdb4eb5-1446-4696-87ae-d9ffd1b83f52" 00:13:21.922 ], 00:13:21.922 "product_name": "Raid Volume", 00:13:21.922 "block_size": 512, 00:13:21.922 "num_blocks": 196608, 00:13:21.922 "uuid": "bfdb4eb5-1446-4696-87ae-d9ffd1b83f52", 00:13:21.922 "assigned_rate_limits": { 00:13:21.922 "rw_ios_per_sec": 0, 00:13:21.922 "rw_mbytes_per_sec": 0, 00:13:21.922 "r_mbytes_per_sec": 0, 00:13:21.922 "w_mbytes_per_sec": 0 00:13:21.922 }, 00:13:21.922 "claimed": false, 00:13:21.922 "zoned": false, 00:13:21.922 "supported_io_types": { 00:13:21.922 "read": true, 00:13:21.922 "write": true, 00:13:21.922 "unmap": true, 00:13:21.922 "write_zeroes": true, 00:13:21.922 "flush": true, 00:13:21.922 "reset": true, 00:13:21.922 "compare": false, 00:13:21.922 "compare_and_write": false, 00:13:21.922 "abort": false, 00:13:21.922 "nvme_admin": false, 00:13:21.922 "nvme_io": false 00:13:21.922 }, 00:13:21.922 "memory_domains": [ 00:13:21.922 { 00:13:21.922 "dma_device_id": "system", 00:13:21.922 "dma_device_type": 1 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.922 "dma_device_type": 2 00:13:21.922 }, 
00:13:21.922 { 00:13:21.922 "dma_device_id": "system", 00:13:21.922 "dma_device_type": 1 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.922 "dma_device_type": 2 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "dma_device_id": "system", 00:13:21.922 "dma_device_type": 1 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.922 "dma_device_type": 2 00:13:21.922 } 00:13:21.922 ], 00:13:21.922 "driver_specific": { 00:13:21.922 "raid": { 00:13:21.922 "uuid": "bfdb4eb5-1446-4696-87ae-d9ffd1b83f52", 00:13:21.922 "strip_size_kb": 64, 00:13:21.922 "state": "online", 00:13:21.922 "raid_level": "concat", 00:13:21.922 "superblock": false, 00:13:21.922 "num_base_bdevs": 3, 00:13:21.922 "num_base_bdevs_discovered": 3, 00:13:21.922 "num_base_bdevs_operational": 3, 00:13:21.922 "base_bdevs_list": [ 00:13:21.922 { 00:13:21.922 "name": "BaseBdev1", 00:13:21.922 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:21.922 "is_configured": true, 00:13:21.922 "data_offset": 0, 00:13:21.922 "data_size": 65536 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "name": "BaseBdev2", 00:13:21.922 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:21.922 "is_configured": true, 00:13:21.922 "data_offset": 0, 00:13:21.922 "data_size": 65536 00:13:21.922 }, 00:13:21.922 { 00:13:21.922 "name": "BaseBdev3", 00:13:21.922 "uuid": "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8", 00:13:21.922 "is_configured": true, 00:13:21.922 "data_offset": 0, 00:13:21.922 "data_size": 65536 00:13:21.922 } 00:13:21.922 ] 00:13:21.922 } 00:13:21.922 } 00:13:21.922 }' 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:13:21.922 BaseBdev2 00:13:21.922 BaseBdev3' 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:21.922 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:22.181 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:22.181 "name": "BaseBdev1", 00:13:22.181 "aliases": [ 00:13:22.181 "0a213dec-4e04-4bd5-ab33-f7bcee0e2078" 00:13:22.181 ], 00:13:22.181 "product_name": "Malloc disk", 00:13:22.181 "block_size": 512, 00:13:22.181 "num_blocks": 65536, 00:13:22.181 "uuid": "0a213dec-4e04-4bd5-ab33-f7bcee0e2078", 00:13:22.181 "assigned_rate_limits": { 00:13:22.181 "rw_ios_per_sec": 0, 00:13:22.181 "rw_mbytes_per_sec": 0, 00:13:22.181 "r_mbytes_per_sec": 0, 00:13:22.181 "w_mbytes_per_sec": 0 00:13:22.181 }, 00:13:22.181 "claimed": true, 00:13:22.181 "claim_type": "exclusive_write", 00:13:22.181 "zoned": false, 00:13:22.181 "supported_io_types": { 00:13:22.181 "read": true, 00:13:22.181 "write": true, 00:13:22.181 "unmap": true, 00:13:22.181 "write_zeroes": true, 00:13:22.181 "flush": true, 00:13:22.181 "reset": true, 00:13:22.181 "compare": false, 00:13:22.181 "compare_and_write": false, 00:13:22.181 "abort": true, 00:13:22.181 "nvme_admin": false, 00:13:22.181 "nvme_io": false 00:13:22.181 }, 00:13:22.181 "memory_domains": [ 00:13:22.181 { 00:13:22.181 "dma_device_id": "system", 
00:13:22.181 "dma_device_type": 1 00:13:22.181 }, 00:13:22.181 { 00:13:22.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.181 "dma_device_type": 2 00:13:22.181 } 00:13:22.181 ], 00:13:22.181 "driver_specific": {} 00:13:22.181 }' 00:13:22.181 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:22.439 23:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:22.439 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:22.439 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:22.698 "name": "BaseBdev2", 00:13:22.698 "aliases": [ 00:13:22.698 "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd" 00:13:22.698 ], 00:13:22.698 "product_name": "Malloc disk", 00:13:22.698 "block_size": 512, 00:13:22.698 "num_blocks": 65536, 00:13:22.698 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:22.698 "assigned_rate_limits": { 00:13:22.698 "rw_ios_per_sec": 0, 00:13:22.698 "rw_mbytes_per_sec": 0, 00:13:22.698 "r_mbytes_per_sec": 0, 00:13:22.698 "w_mbytes_per_sec": 0 00:13:22.698 }, 00:13:22.698 "claimed": true, 00:13:22.698 "claim_type": "exclusive_write", 00:13:22.698 "zoned": false, 00:13:22.698 "supported_io_types": { 00:13:22.698 "read": true, 00:13:22.698 "write": true, 00:13:22.698 "unmap": true, 00:13:22.698 "write_zeroes": true, 00:13:22.698 "flush": true, 00:13:22.698 "reset": true, 00:13:22.698 "compare": false, 00:13:22.698 "compare_and_write": false, 00:13:22.698 "abort": true, 00:13:22.698 "nvme_admin": false, 00:13:22.698 "nvme_io": false 00:13:22.698 }, 00:13:22.698 "memory_domains": [ 00:13:22.698 { 00:13:22.698 "dma_device_id": "system", 00:13:22.698 "dma_device_type": 1 00:13:22.698 }, 00:13:22.698 { 00:13:22.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.698 "dma_device_type": 2 00:13:22.698 } 00:13:22.698 ], 00:13:22.698 "driver_specific": {} 00:13:22.698 }' 00:13:22.698 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:22.956 23:55:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:22.956 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:23.222 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:23.513 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:23.513 "name": "BaseBdev3", 00:13:23.513 "aliases": [ 00:13:23.513 "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8" 00:13:23.513 ], 00:13:23.513 "product_name": "Malloc disk", 00:13:23.513 "block_size": 512, 00:13:23.513 "num_blocks": 65536, 00:13:23.513 "uuid": "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8", 00:13:23.513 "assigned_rate_limits": { 00:13:23.513 "rw_ios_per_sec": 0, 00:13:23.513 "rw_mbytes_per_sec": 0, 00:13:23.513 "r_mbytes_per_sec": 0, 00:13:23.513 "w_mbytes_per_sec": 0 00:13:23.513 }, 00:13:23.513 "claimed": true, 00:13:23.513 "claim_type": "exclusive_write", 00:13:23.513 "zoned": false, 00:13:23.513 "supported_io_types": { 00:13:23.513 "read": true, 00:13:23.513 "write": true, 00:13:23.513 "unmap": true, 00:13:23.513 "write_zeroes": true, 00:13:23.513 "flush": true, 00:13:23.513 "reset": true, 00:13:23.513 "compare": false, 00:13:23.513 "compare_and_write": false, 00:13:23.513 "abort": true, 00:13:23.513 "nvme_admin": false, 00:13:23.513 "nvme_io": false 00:13:23.513 }, 00:13:23.513 "memory_domains": [ 00:13:23.513 { 00:13:23.513 "dma_device_id": "system", 00:13:23.513 "dma_device_type": 1 00:13:23.513 }, 00:13:23.513 { 00:13:23.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.513 "dma_device_type": 2 00:13:23.513 } 00:13:23.513 ], 00:13:23.513 "driver_specific": {} 00:13:23.513 }' 00:13:23.513 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:23.513 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:23.513 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:23.513 23:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:23.513 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:23.513 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.513 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:13:23.513 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:23.771 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.771 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:23.771 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:23.771 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:23.771 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:24.029 [2024-05-14 23:55:24.439101] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:24.029 [2024-05-14 23:55:24.439129] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:24.029 [2024-05-14 23:55:24.439176] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.029 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.287 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:24.287 "name": "Existed_Raid", 00:13:24.287 "uuid": "bfdb4eb5-1446-4696-87ae-d9ffd1b83f52", 00:13:24.287 "strip_size_kb": 64, 00:13:24.287 "state": "offline", 00:13:24.287 "raid_level": "concat", 00:13:24.287 "superblock": false, 00:13:24.287 "num_base_bdevs": 3, 00:13:24.287 "num_base_bdevs_discovered": 2, 00:13:24.287 
"num_base_bdevs_operational": 2, 00:13:24.287 "base_bdevs_list": [ 00:13:24.287 { 00:13:24.287 "name": null, 00:13:24.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.287 "is_configured": false, 00:13:24.287 "data_offset": 0, 00:13:24.287 "data_size": 65536 00:13:24.287 }, 00:13:24.287 { 00:13:24.287 "name": "BaseBdev2", 00:13:24.287 "uuid": "b8323c66-ba33-4cb6-9a8d-b90e8b410bbd", 00:13:24.287 "is_configured": true, 00:13:24.287 "data_offset": 0, 00:13:24.287 "data_size": 65536 00:13:24.287 }, 00:13:24.287 { 00:13:24.287 "name": "BaseBdev3", 00:13:24.287 "uuid": "f7f74f1a-de62-4217-8f9d-f0aa9501dcc8", 00:13:24.287 "is_configured": true, 00:13:24.287 "data_offset": 0, 00:13:24.287 "data_size": 65536 00:13:24.287 } 00:13:24.287 ] 00:13:24.287 }' 00:13:24.287 23:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:24.287 23:55:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.851 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:13:24.851 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:24.851 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.851 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:25.108 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:25.108 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:25.108 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:25.366 [2024-05-14 23:55:25.703541] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:25.366 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:25.366 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:25.366 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.366 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:25.624 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:25.624 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:25.624 23:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:25.624 [2024-05-14 23:55:26.211497] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:25.624 [2024-05-14 23:55:26.211541] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1254560 name Existed_Raid, state offline 00:13:25.882 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:25.882 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:25.882 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.882 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:26.140 BaseBdev2 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:26.140 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:26.398 23:55:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:26.656 [ 00:13:26.656 { 00:13:26.656 "name": "BaseBdev2", 00:13:26.656 "aliases": [ 00:13:26.656 "546b3d8e-c484-47b9-b47b-454ddeba6f72" 00:13:26.656 ], 00:13:26.656 "product_name": "Malloc disk", 00:13:26.656 "block_size": 512, 00:13:26.656 "num_blocks": 65536, 00:13:26.656 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:26.656 "assigned_rate_limits": { 00:13:26.656 "rw_ios_per_sec": 0, 00:13:26.656 "rw_mbytes_per_sec": 0, 00:13:26.656 "r_mbytes_per_sec": 0, 00:13:26.656 "w_mbytes_per_sec": 0 00:13:26.656 }, 00:13:26.656 "claimed": false, 00:13:26.656 "zoned": false, 00:13:26.656 "supported_io_types": { 00:13:26.656 "read": true, 00:13:26.656 "write": true, 00:13:26.656 "unmap": true, 00:13:26.656 "write_zeroes": true, 00:13:26.656 "flush": true, 00:13:26.656 "reset": true, 00:13:26.656 "compare": false, 00:13:26.656 "compare_and_write": false, 00:13:26.656 "abort": true, 00:13:26.656 "nvme_admin": false, 00:13:26.656 "nvme_io": false 00:13:26.656 }, 00:13:26.656 "memory_domains": [ 00:13:26.656 { 00:13:26.656 "dma_device_id": "system", 00:13:26.656 "dma_device_type": 1 00:13:26.656 }, 00:13:26.656 { 00:13:26.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.656 "dma_device_type": 2 00:13:26.656 } 00:13:26.656 ], 00:13:26.656 "driver_specific": {} 00:13:26.656 } 00:13:26.656 ] 00:13:26.656 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:26.656 23:55:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:26.656 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:26.656 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:26.914 BaseBdev3 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:26.914 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.172 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:27.430 [ 00:13:27.430 { 00:13:27.430 "name": "BaseBdev3", 00:13:27.430 "aliases": [ 00:13:27.430 "bfce50e8-4715-4553-9dde-6fe97f7e118b" 00:13:27.430 ], 00:13:27.430 "product_name": "Malloc disk", 00:13:27.430 "block_size": 512, 00:13:27.430 "num_blocks": 65536, 00:13:27.430 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:27.430 "assigned_rate_limits": { 00:13:27.430 "rw_ios_per_sec": 0, 00:13:27.430 "rw_mbytes_per_sec": 0, 00:13:27.430 "r_mbytes_per_sec": 0, 00:13:27.430 "w_mbytes_per_sec": 0 00:13:27.430 }, 00:13:27.430 "claimed": false, 00:13:27.430 "zoned": false, 00:13:27.430 "supported_io_types": { 00:13:27.430 "read": true, 00:13:27.430 "write": true, 00:13:27.430 "unmap": true, 00:13:27.430 "write_zeroes": true, 00:13:27.430 "flush": true, 00:13:27.430 "reset": true, 00:13:27.430 "compare": false, 00:13:27.430 "compare_and_write": false, 00:13:27.430 "abort": true, 00:13:27.430 "nvme_admin": false, 00:13:27.430 "nvme_io": false 00:13:27.430 }, 00:13:27.430 "memory_domains": [ 00:13:27.430 { 00:13:27.430 "dma_device_id": "system", 00:13:27.430 "dma_device_type": 1 00:13:27.430 }, 00:13:27.430 { 00:13:27.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.430 "dma_device_type": 2 00:13:27.430 } 00:13:27.430 ], 00:13:27.430 "driver_specific": {} 00:13:27.430 } 00:13:27.430 ] 00:13:27.430 23:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:27.430 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:27.430 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:27.430 23:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:27.688 [2024-05-14 23:55:28.151816] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:27.688 
[2024-05-14 23:55:28.151860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:27.688 [2024-05-14 23:55:28.151881] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:27.688 [2024-05-14 23:55:28.153279] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.688 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:27.946 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:27.946 "name": "Existed_Raid", 00:13:27.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.946 "strip_size_kb": 64, 00:13:27.946 "state": "configuring", 00:13:27.946 "raid_level": "concat", 00:13:27.946 "superblock": false, 00:13:27.946 "num_base_bdevs": 3, 00:13:27.946 "num_base_bdevs_discovered": 2, 00:13:27.946 "num_base_bdevs_operational": 3, 00:13:27.946 "base_bdevs_list": [ 00:13:27.946 { 00:13:27.946 "name": "BaseBdev1", 00:13:27.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.946 "is_configured": false, 00:13:27.946 "data_offset": 0, 00:13:27.946 "data_size": 0 00:13:27.946 }, 00:13:27.946 { 00:13:27.946 "name": "BaseBdev2", 00:13:27.946 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:27.946 "is_configured": true, 00:13:27.946 "data_offset": 0, 00:13:27.946 "data_size": 65536 00:13:27.946 }, 00:13:27.946 { 00:13:27.946 "name": "BaseBdev3", 00:13:27.946 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:27.946 "is_configured": true, 00:13:27.946 "data_offset": 0, 00:13:27.946 "data_size": 65536 00:13:27.946 } 00:13:27.946 ] 00:13:27.946 }' 00:13:27.946 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:27.946 23:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.513 23:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:28.771 [2024-05-14 23:55:29.146434] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:28.771 "name": "Existed_Raid", 00:13:28.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.771 "strip_size_kb": 64, 00:13:28.771 "state": "configuring", 00:13:28.771 "raid_level": "concat", 00:13:28.771 "superblock": false, 00:13:28.771 "num_base_bdevs": 3, 00:13:28.771 "num_base_bdevs_discovered": 1, 00:13:28.771 "num_base_bdevs_operational": 3, 00:13:28.771 "base_bdevs_list": [ 00:13:28.771 { 00:13:28.771 "name": "BaseBdev1", 00:13:28.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.771 "is_configured": false, 00:13:28.771 "data_offset": 0, 00:13:28.771 "data_size": 0 00:13:28.771 }, 00:13:28.771 { 00:13:28.771 "name": null, 00:13:28.771 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:28.771 "is_configured": false, 00:13:28.771 "data_offset": 0, 00:13:28.771 "data_size": 65536 00:13:28.771 }, 00:13:28.771 { 00:13:28.771 "name": "BaseBdev3", 00:13:28.771 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:28.771 "is_configured": true, 00:13:28.771 "data_offset": 0, 00:13:28.771 "data_size": 65536 00:13:28.771 } 00:13:28.771 ] 00:13:28.771 }' 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:28.771 23:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.338 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.338 23:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:29.596 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:13:29.596 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:29.853 [2024-05-14 23:55:30.296915] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:29.853 BaseBdev1 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:29.853 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.112 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:30.370 [ 00:13:30.370 { 00:13:30.370 "name": "BaseBdev1", 00:13:30.370 "aliases": [ 00:13:30.370 "462708f9-24b6-465f-be31-f0a8c54abc4a" 00:13:30.370 ], 00:13:30.370 "product_name": "Malloc disk", 00:13:30.370 "block_size": 512, 00:13:30.370 "num_blocks": 65536, 00:13:30.370 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:30.370 "assigned_rate_limits": { 00:13:30.370 "rw_ios_per_sec": 0, 00:13:30.370 "rw_mbytes_per_sec": 0, 00:13:30.370 "r_mbytes_per_sec": 0, 00:13:30.370 "w_mbytes_per_sec": 0 00:13:30.370 }, 00:13:30.370 "claimed": true, 00:13:30.370 "claim_type": "exclusive_write", 00:13:30.370 "zoned": false, 00:13:30.370 "supported_io_types": { 00:13:30.370 "read": true, 00:13:30.370 "write": true, 00:13:30.370 "unmap": true, 00:13:30.370 "write_zeroes": true, 00:13:30.370 "flush": true, 00:13:30.370 "reset": true, 00:13:30.370 "compare": false, 00:13:30.370 "compare_and_write": false, 00:13:30.370 "abort": true, 00:13:30.370 "nvme_admin": false, 00:13:30.370 "nvme_io": false 00:13:30.370 }, 00:13:30.370 "memory_domains": [ 00:13:30.370 { 00:13:30.370 "dma_device_id": "system", 00:13:30.370 "dma_device_type": 1 00:13:30.370 }, 00:13:30.370 { 00:13:30.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.370 "dma_device_type": 2 00:13:30.370 } 00:13:30.370 ], 00:13:30.370 "driver_specific": {} 00:13:30.370 } 00:13:30.370 ] 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=3 00:13:30.370 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:30.371 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:30.371 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:30.371 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:30.371 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.371 23:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.629 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:30.629 "name": "Existed_Raid", 00:13:30.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.629 "strip_size_kb": 64, 00:13:30.629 "state": "configuring", 00:13:30.629 "raid_level": "concat", 00:13:30.629 "superblock": false, 00:13:30.629 "num_base_bdevs": 3, 00:13:30.629 "num_base_bdevs_discovered": 2, 00:13:30.629 "num_base_bdevs_operational": 3, 00:13:30.629 "base_bdevs_list": [ 00:13:30.629 { 00:13:30.629 "name": "BaseBdev1", 00:13:30.629 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:30.629 "is_configured": true, 00:13:30.629 "data_offset": 0, 00:13:30.629 "data_size": 65536 00:13:30.629 }, 00:13:30.629 { 00:13:30.629 "name": null, 00:13:30.629 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:30.629 "is_configured": false, 00:13:30.629 "data_offset": 0, 00:13:30.629 "data_size": 65536 00:13:30.629 }, 00:13:30.629 { 00:13:30.629 "name": "BaseBdev3", 00:13:30.629 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:30.629 "is_configured": true, 00:13:30.629 "data_offset": 0, 00:13:30.629 "data_size": 65536 00:13:30.629 } 00:13:30.629 ] 00:13:30.629 }' 00:13:30.629 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:30.629 23:55:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.195 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.195 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:31.454 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:13:31.454 23:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:31.712 [2024-05-14 23:55:32.117801] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.712 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.970 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:31.970 "name": "Existed_Raid", 00:13:31.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.970 "strip_size_kb": 64, 00:13:31.970 "state": "configuring", 00:13:31.970 "raid_level": "concat", 00:13:31.970 "superblock": false, 00:13:31.970 "num_base_bdevs": 3, 00:13:31.970 "num_base_bdevs_discovered": 1, 00:13:31.970 "num_base_bdevs_operational": 3, 00:13:31.970 "base_bdevs_list": [ 00:13:31.970 { 00:13:31.970 "name": "BaseBdev1", 00:13:31.970 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:31.970 "is_configured": true, 00:13:31.970 "data_offset": 0, 00:13:31.970 "data_size": 65536 00:13:31.970 }, 00:13:31.970 { 00:13:31.970 "name": null, 00:13:31.970 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:31.970 "is_configured": false, 00:13:31.970 "data_offset": 0, 00:13:31.970 "data_size": 65536 00:13:31.970 }, 00:13:31.970 { 00:13:31.970 "name": null, 00:13:31.970 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:31.970 "is_configured": false, 00:13:31.970 "data_offset": 0, 00:13:31.970 "data_size": 65536 00:13:31.970 } 00:13:31.970 ] 00:13:31.970 }' 00:13:31.970 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:31.971 23:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.544 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.544 23:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:32.802 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:13:32.802 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:33.059 [2024-05-14 23:55:33.449346] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.059 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:33.317 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:33.317 "name": "Existed_Raid", 00:13:33.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:33.317 "strip_size_kb": 64, 00:13:33.317 "state": "configuring", 00:13:33.317 "raid_level": "concat", 00:13:33.317 "superblock": false, 00:13:33.317 "num_base_bdevs": 3, 00:13:33.317 "num_base_bdevs_discovered": 2, 00:13:33.317 "num_base_bdevs_operational": 3, 00:13:33.317 "base_bdevs_list": [ 00:13:33.317 { 00:13:33.317 "name": "BaseBdev1", 00:13:33.317 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:33.317 "is_configured": true, 00:13:33.317 "data_offset": 0, 00:13:33.317 "data_size": 65536 00:13:33.317 }, 00:13:33.317 { 00:13:33.317 "name": null, 00:13:33.317 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:33.317 "is_configured": false, 00:13:33.317 "data_offset": 0, 00:13:33.317 "data_size": 65536 00:13:33.317 }, 00:13:33.317 { 00:13:33.317 "name": "BaseBdev3", 00:13:33.317 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:33.317 "is_configured": true, 00:13:33.317 "data_offset": 0, 00:13:33.317 "data_size": 65536 00:13:33.317 } 00:13:33.317 ] 00:13:33.317 }' 00:13:33.317 23:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:33.317 23:55:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.883 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.883 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:34.140 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:13:34.140 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:34.398 [2024-05-14 23:55:34.785108] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.398 23:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.656 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:34.656 "name": "Existed_Raid", 00:13:34.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.656 "strip_size_kb": 64, 00:13:34.656 "state": "configuring", 00:13:34.656 "raid_level": "concat", 00:13:34.656 "superblock": false, 00:13:34.656 "num_base_bdevs": 3, 00:13:34.656 "num_base_bdevs_discovered": 1, 00:13:34.656 "num_base_bdevs_operational": 3, 00:13:34.656 "base_bdevs_list": [ 00:13:34.656 { 00:13:34.656 "name": null, 00:13:34.656 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:34.656 "is_configured": false, 00:13:34.656 "data_offset": 0, 00:13:34.656 "data_size": 65536 00:13:34.656 }, 00:13:34.656 { 00:13:34.656 "name": null, 00:13:34.656 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:34.656 "is_configured": false, 00:13:34.656 "data_offset": 0, 00:13:34.656 "data_size": 65536 00:13:34.656 }, 00:13:34.656 { 00:13:34.656 "name": "BaseBdev3", 00:13:34.656 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:34.656 "is_configured": true, 00:13:34.656 "data_offset": 0, 00:13:34.656 "data_size": 65536 00:13:34.656 } 00:13:34.656 ] 00:13:34.656 }' 00:13:34.656 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:34.656 23:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.223 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:35.223 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.480 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:13:35.480 23:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:35.738 [2024-05-14 23:55:36.092998] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:35.738 23:55:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.738 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.995 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:35.995 "name": "Existed_Raid", 00:13:35.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.995 "strip_size_kb": 64, 00:13:35.995 "state": "configuring", 00:13:35.995 "raid_level": "concat", 00:13:35.995 "superblock": false, 00:13:35.995 "num_base_bdevs": 3, 00:13:35.995 "num_base_bdevs_discovered": 2, 00:13:35.995 "num_base_bdevs_operational": 3, 00:13:35.995 "base_bdevs_list": [ 00:13:35.995 { 00:13:35.995 "name": null, 00:13:35.995 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:35.995 "is_configured": false, 00:13:35.995 "data_offset": 0, 00:13:35.995 "data_size": 65536 00:13:35.995 }, 00:13:35.995 { 00:13:35.995 "name": "BaseBdev2", 00:13:35.995 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:35.995 "is_configured": true, 00:13:35.995 "data_offset": 0, 00:13:35.995 "data_size": 65536 00:13:35.995 }, 00:13:35.995 { 00:13:35.995 "name": "BaseBdev3", 00:13:35.995 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:35.995 "is_configured": true, 00:13:35.995 "data_offset": 0, 00:13:35.995 "data_size": 65536 00:13:35.995 } 00:13:35.995 ] 00:13:35.995 }' 00:13:35.995 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:35.995 23:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.560 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.561 23:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:36.819 23:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:13:36.819 23:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:36.819 23:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.077 23:55:37 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 462708f9-24b6-465f-be31-f0a8c54abc4a 00:13:37.335 [2024-05-14 23:55:37.689810] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:37.335 [2024-05-14 23:55:37.689850] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1252ee0 00:13:37.335 [2024-05-14 23:55:37.689859] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:37.335 [2024-05-14 23:55:37.690046] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f8200 00:13:37.335 [2024-05-14 23:55:37.690172] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1252ee0 00:13:37.335 [2024-05-14 23:55:37.690182] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1252ee0 00:13:37.335 [2024-05-14 23:55:37.690345] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.335 NewBaseBdev 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:37.335 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.593 23:55:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:37.593 [ 00:13:37.593 { 00:13:37.593 "name": "NewBaseBdev", 00:13:37.593 "aliases": [ 00:13:37.593 "462708f9-24b6-465f-be31-f0a8c54abc4a" 00:13:37.593 ], 00:13:37.593 "product_name": "Malloc disk", 00:13:37.593 "block_size": 512, 00:13:37.593 "num_blocks": 65536, 00:13:37.593 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:37.593 "assigned_rate_limits": { 00:13:37.593 "rw_ios_per_sec": 0, 00:13:37.593 "rw_mbytes_per_sec": 0, 00:13:37.593 "r_mbytes_per_sec": 0, 00:13:37.593 "w_mbytes_per_sec": 0 00:13:37.593 }, 00:13:37.593 "claimed": true, 00:13:37.593 "claim_type": "exclusive_write", 00:13:37.593 "zoned": false, 00:13:37.593 "supported_io_types": { 00:13:37.593 "read": true, 00:13:37.593 "write": true, 00:13:37.593 "unmap": true, 00:13:37.593 "write_zeroes": true, 00:13:37.593 "flush": true, 00:13:37.593 "reset": true, 00:13:37.593 "compare": false, 00:13:37.593 "compare_and_write": false, 00:13:37.593 "abort": true, 00:13:37.593 "nvme_admin": false, 00:13:37.593 "nvme_io": false 00:13:37.593 }, 00:13:37.593 "memory_domains": [ 00:13:37.593 { 00:13:37.593 "dma_device_id": "system", 00:13:37.593 "dma_device_type": 1 00:13:37.593 }, 00:13:37.593 { 00:13:37.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.593 "dma_device_type": 2 00:13:37.593 } 00:13:37.593 ], 00:13:37.593 "driver_specific": {} 
00:13:37.593 } 00:13:37.593 ] 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:37.852 "name": "Existed_Raid", 00:13:37.852 "uuid": "3f2e5edc-58c3-4888-8678-7383c1fd1963", 00:13:37.852 "strip_size_kb": 64, 00:13:37.852 "state": "online", 00:13:37.852 "raid_level": "concat", 00:13:37.852 "superblock": false, 00:13:37.852 "num_base_bdevs": 3, 00:13:37.852 "num_base_bdevs_discovered": 3, 00:13:37.852 "num_base_bdevs_operational": 3, 00:13:37.852 "base_bdevs_list": [ 00:13:37.852 { 00:13:37.852 "name": "NewBaseBdev", 00:13:37.852 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:37.852 "is_configured": true, 00:13:37.852 "data_offset": 0, 00:13:37.852 "data_size": 65536 00:13:37.852 }, 00:13:37.852 { 00:13:37.852 "name": "BaseBdev2", 00:13:37.852 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:37.852 "is_configured": true, 00:13:37.852 "data_offset": 0, 00:13:37.852 "data_size": 65536 00:13:37.852 }, 00:13:37.852 { 00:13:37.852 "name": "BaseBdev3", 00:13:37.852 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:37.852 "is_configured": true, 00:13:37.852 "data_offset": 0, 00:13:37.852 "data_size": 65536 00:13:37.852 } 00:13:37.852 ] 00:13:37.852 }' 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:37.852 23:55:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.419 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:38.676 [2024-05-14 23:55:39.234175] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:38.676 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:38.676 "name": "Existed_Raid", 00:13:38.676 "aliases": [ 00:13:38.677 "3f2e5edc-58c3-4888-8678-7383c1fd1963" 00:13:38.677 ], 00:13:38.677 "product_name": "Raid Volume", 00:13:38.677 "block_size": 512, 00:13:38.677 "num_blocks": 196608, 00:13:38.677 "uuid": "3f2e5edc-58c3-4888-8678-7383c1fd1963", 00:13:38.677 "assigned_rate_limits": { 00:13:38.677 "rw_ios_per_sec": 0, 00:13:38.677 "rw_mbytes_per_sec": 0, 00:13:38.677 "r_mbytes_per_sec": 0, 00:13:38.677 "w_mbytes_per_sec": 0 00:13:38.677 }, 00:13:38.677 "claimed": false, 00:13:38.677 "zoned": false, 00:13:38.677 "supported_io_types": { 00:13:38.677 "read": true, 00:13:38.677 "write": true, 00:13:38.677 "unmap": true, 00:13:38.677 "write_zeroes": true, 00:13:38.677 "flush": true, 00:13:38.677 "reset": true, 00:13:38.677 "compare": false, 00:13:38.677 "compare_and_write": false, 00:13:38.677 "abort": false, 00:13:38.677 "nvme_admin": false, 00:13:38.677 "nvme_io": false 00:13:38.677 }, 00:13:38.677 "memory_domains": [ 00:13:38.677 { 00:13:38.677 "dma_device_id": "system", 00:13:38.677 "dma_device_type": 1 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.677 "dma_device_type": 2 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "dma_device_id": "system", 00:13:38.677 "dma_device_type": 1 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.677 "dma_device_type": 2 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "dma_device_id": "system", 00:13:38.677 "dma_device_type": 1 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.677 "dma_device_type": 2 00:13:38.677 } 00:13:38.677 ], 00:13:38.677 "driver_specific": { 00:13:38.677 "raid": { 00:13:38.677 "uuid": "3f2e5edc-58c3-4888-8678-7383c1fd1963", 00:13:38.677 "strip_size_kb": 64, 00:13:38.677 "state": "online", 00:13:38.677 "raid_level": "concat", 00:13:38.677 "superblock": false, 00:13:38.677 "num_base_bdevs": 3, 00:13:38.677 "num_base_bdevs_discovered": 3, 00:13:38.677 "num_base_bdevs_operational": 3, 00:13:38.677 "base_bdevs_list": [ 00:13:38.677 { 00:13:38.677 "name": "NewBaseBdev", 00:13:38.677 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:38.677 "is_configured": true, 00:13:38.677 "data_offset": 0, 00:13:38.677 "data_size": 65536 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "name": "BaseBdev2", 00:13:38.677 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:38.677 "is_configured": true, 00:13:38.677 "data_offset": 0, 00:13:38.677 "data_size": 65536 00:13:38.677 }, 00:13:38.677 { 00:13:38.677 "name": "BaseBdev3", 00:13:38.677 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:38.677 "is_configured": true, 00:13:38.677 "data_offset": 0, 00:13:38.677 "data_size": 65536 00:13:38.677 } 00:13:38.677 ] 00:13:38.677 } 00:13:38.677 } 00:13:38.677 }' 00:13:38.677 23:55:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:38.934 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:13:38.934 BaseBdev2 00:13:38.934 BaseBdev3' 00:13:38.934 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:38.934 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:38.934 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:39.192 "name": "NewBaseBdev", 00:13:39.192 "aliases": [ 00:13:39.192 "462708f9-24b6-465f-be31-f0a8c54abc4a" 00:13:39.192 ], 00:13:39.192 "product_name": "Malloc disk", 00:13:39.192 "block_size": 512, 00:13:39.192 "num_blocks": 65536, 00:13:39.192 "uuid": "462708f9-24b6-465f-be31-f0a8c54abc4a", 00:13:39.192 "assigned_rate_limits": { 00:13:39.192 "rw_ios_per_sec": 0, 00:13:39.192 "rw_mbytes_per_sec": 0, 00:13:39.192 "r_mbytes_per_sec": 0, 00:13:39.192 "w_mbytes_per_sec": 0 00:13:39.192 }, 00:13:39.192 "claimed": true, 00:13:39.192 "claim_type": "exclusive_write", 00:13:39.192 "zoned": false, 00:13:39.192 "supported_io_types": { 00:13:39.192 "read": true, 00:13:39.192 "write": true, 00:13:39.192 "unmap": true, 00:13:39.192 "write_zeroes": true, 00:13:39.192 "flush": true, 00:13:39.192 "reset": true, 00:13:39.192 "compare": false, 00:13:39.192 "compare_and_write": false, 00:13:39.192 "abort": true, 00:13:39.192 "nvme_admin": false, 00:13:39.192 "nvme_io": false 00:13:39.192 }, 00:13:39.192 "memory_domains": [ 00:13:39.192 { 00:13:39.192 "dma_device_id": "system", 00:13:39.192 "dma_device_type": 1 00:13:39.192 }, 00:13:39.192 { 00:13:39.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.192 "dma_device_type": 2 00:13:39.192 } 00:13:39.192 ], 00:13:39.192 "driver_specific": {} 00:13:39.192 }' 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:39.192 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:39.450 23:55:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:39.450 23:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:39.721 "name": "BaseBdev2", 00:13:39.721 "aliases": [ 00:13:39.721 "546b3d8e-c484-47b9-b47b-454ddeba6f72" 00:13:39.721 ], 00:13:39.721 "product_name": "Malloc disk", 00:13:39.721 "block_size": 512, 00:13:39.721 "num_blocks": 65536, 00:13:39.721 "uuid": "546b3d8e-c484-47b9-b47b-454ddeba6f72", 00:13:39.721 "assigned_rate_limits": { 00:13:39.721 "rw_ios_per_sec": 0, 00:13:39.721 "rw_mbytes_per_sec": 0, 00:13:39.721 "r_mbytes_per_sec": 0, 00:13:39.721 "w_mbytes_per_sec": 0 00:13:39.721 }, 00:13:39.721 "claimed": true, 00:13:39.721 "claim_type": "exclusive_write", 00:13:39.721 "zoned": false, 00:13:39.721 "supported_io_types": { 00:13:39.721 "read": true, 00:13:39.721 "write": true, 00:13:39.721 "unmap": true, 00:13:39.721 "write_zeroes": true, 00:13:39.721 "flush": true, 00:13:39.721 "reset": true, 00:13:39.721 "compare": false, 00:13:39.721 "compare_and_write": false, 00:13:39.721 "abort": true, 00:13:39.721 "nvme_admin": false, 00:13:39.721 "nvme_io": false 00:13:39.721 }, 00:13:39.721 "memory_domains": [ 00:13:39.721 { 00:13:39.721 "dma_device_id": "system", 00:13:39.721 "dma_device_type": 1 00:13:39.721 }, 00:13:39.721 { 00:13:39.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.721 "dma_device_type": 2 00:13:39.721 } 00:13:39.721 ], 00:13:39.721 "driver_specific": {} 00:13:39.721 }' 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:39.721 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:39.993 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:40.252 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:40.252 "name": "BaseBdev3", 00:13:40.252 "aliases": [ 00:13:40.252 
"bfce50e8-4715-4553-9dde-6fe97f7e118b" 00:13:40.252 ], 00:13:40.252 "product_name": "Malloc disk", 00:13:40.252 "block_size": 512, 00:13:40.252 "num_blocks": 65536, 00:13:40.252 "uuid": "bfce50e8-4715-4553-9dde-6fe97f7e118b", 00:13:40.252 "assigned_rate_limits": { 00:13:40.252 "rw_ios_per_sec": 0, 00:13:40.252 "rw_mbytes_per_sec": 0, 00:13:40.252 "r_mbytes_per_sec": 0, 00:13:40.252 "w_mbytes_per_sec": 0 00:13:40.252 }, 00:13:40.252 "claimed": true, 00:13:40.252 "claim_type": "exclusive_write", 00:13:40.252 "zoned": false, 00:13:40.252 "supported_io_types": { 00:13:40.252 "read": true, 00:13:40.252 "write": true, 00:13:40.252 "unmap": true, 00:13:40.252 "write_zeroes": true, 00:13:40.252 "flush": true, 00:13:40.252 "reset": true, 00:13:40.252 "compare": false, 00:13:40.252 "compare_and_write": false, 00:13:40.252 "abort": true, 00:13:40.252 "nvme_admin": false, 00:13:40.252 "nvme_io": false 00:13:40.252 }, 00:13:40.252 "memory_domains": [ 00:13:40.252 { 00:13:40.252 "dma_device_id": "system", 00:13:40.252 "dma_device_type": 1 00:13:40.252 }, 00:13:40.252 { 00:13:40.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.252 "dma_device_type": 2 00:13:40.252 } 00:13:40.252 ], 00:13:40.252 "driver_specific": {} 00:13:40.252 }' 00:13:40.252 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:40.252 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:40.252 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:40.252 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:40.511 23:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:40.511 23:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:40.511 23:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:40.511 23:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:40.770 [2024-05-14 23:55:41.299431] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:40.770 [2024-05-14 23:55:41.299463] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:40.770 [2024-05-14 23:55:41.299524] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.770 [2024-05-14 23:55:41.299582] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:40.770 [2024-05-14 23:55:41.299594] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1252ee0 name Existed_Raid, state offline 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 411259 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 411259 ']' 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 411259 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:40.770 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 411259 00:13:41.029 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:41.029 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:41.029 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 411259' 00:13:41.029 killing process with pid 411259 00:13:41.029 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 411259 00:13:41.029 [2024-05-14 23:55:41.368995] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:41.029 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 411259 00:13:41.029 [2024-05-14 23:55:41.431028] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:13:41.596 00:13:41.596 real 0m27.930s 00:13:41.596 user 0m50.981s 00:13:41.596 sys 0m4.994s 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.596 ************************************ 00:13:41.596 END TEST raid_state_function_test 00:13:41.596 ************************************ 00:13:41.596 23:55:41 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:13:41.596 23:55:41 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:41.596 23:55:41 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:41.596 23:55:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:41.596 ************************************ 00:13:41.596 START TEST raid_state_function_test_sb 00:13:41.596 ************************************ 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 true 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:41.596 23:55:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=415448 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 415448' 00:13:41.596 Process raid pid: 415448 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 415448 /var/tmp/spdk-raid.sock 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 415448 ']' 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:41.596 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:41.597 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:41.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:41.597 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:41.597 23:55:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:41.597 [2024-05-14 23:55:42.035090] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:13:41.597 [2024-05-14 23:55:42.035161] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:41.597 [2024-05-14 23:55:42.166949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:41.855 [2024-05-14 23:55:42.267569] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:41.855 [2024-05-14 23:55:42.330746] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:41.855 [2024-05-14 23:55:42.330780] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:42.420 23:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:42.420 23:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:13:42.420 23:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:42.678 [2024-05-14 23:55:43.125527] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:42.678 [2024-05-14 23:55:43.125568] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:42.678 [2024-05-14 23:55:43.125580] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:42.678 [2024-05-14 23:55:43.125592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:42.678 [2024-05-14 23:55:43.125601] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:42.678 [2024-05-14 23:55:43.125613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.678 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.940 23:55:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:42.940 "name": "Existed_Raid", 00:13:42.940 "uuid": "0c3123f2-4ece-4b95-8cea-446236eebe39", 00:13:42.940 "strip_size_kb": 64, 00:13:42.940 "state": "configuring", 00:13:42.940 "raid_level": "concat", 00:13:42.940 "superblock": true, 00:13:42.940 "num_base_bdevs": 3, 00:13:42.940 "num_base_bdevs_discovered": 0, 00:13:42.940 "num_base_bdevs_operational": 3, 00:13:42.940 "base_bdevs_list": [ 00:13:42.940 { 00:13:42.940 "name": "BaseBdev1", 00:13:42.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.940 "is_configured": false, 00:13:42.940 "data_offset": 0, 00:13:42.940 "data_size": 0 00:13:42.940 }, 00:13:42.940 { 00:13:42.940 "name": "BaseBdev2", 00:13:42.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.940 "is_configured": false, 00:13:42.940 "data_offset": 0, 00:13:42.940 "data_size": 0 00:13:42.940 }, 00:13:42.940 { 00:13:42.940 "name": "BaseBdev3", 00:13:42.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.940 "is_configured": false, 00:13:42.940 "data_offset": 0, 00:13:42.940 "data_size": 0 00:13:42.940 } 00:13:42.940 ] 00:13:42.940 }' 00:13:42.940 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:42.940 23:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.509 23:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:43.767 [2024-05-14 23:55:44.124018] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:43.767 [2024-05-14 23:55:44.124047] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a7be0 name Existed_Raid, state configuring 00:13:43.767 23:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:44.025 [2024-05-14 23:55:44.368682] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:44.025 [2024-05-14 23:55:44.368708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:44.025 [2024-05-14 23:55:44.368718] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:44.025 [2024-05-14 23:55:44.368729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:44.025 [2024-05-14 23:55:44.368738] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:44.025 [2024-05-14 23:55:44.368750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:44.025 23:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:44.283 [2024-05-14 23:55:44.619318] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:44.283 BaseBdev1 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:44.283 23:55:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:44.283 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.541 23:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:44.541 [ 00:13:44.541 { 00:13:44.541 "name": "BaseBdev1", 00:13:44.541 "aliases": [ 00:13:44.541 "3721c918-cf56-4f27-bbde-e04431f28752" 00:13:44.541 ], 00:13:44.541 "product_name": "Malloc disk", 00:13:44.541 "block_size": 512, 00:13:44.541 "num_blocks": 65536, 00:13:44.541 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:44.541 "assigned_rate_limits": { 00:13:44.541 "rw_ios_per_sec": 0, 00:13:44.541 "rw_mbytes_per_sec": 0, 00:13:44.541 "r_mbytes_per_sec": 0, 00:13:44.541 "w_mbytes_per_sec": 0 00:13:44.541 }, 00:13:44.541 "claimed": true, 00:13:44.541 "claim_type": "exclusive_write", 00:13:44.542 "zoned": false, 00:13:44.542 "supported_io_types": { 00:13:44.542 "read": true, 00:13:44.542 "write": true, 00:13:44.542 "unmap": true, 00:13:44.542 "write_zeroes": true, 00:13:44.542 "flush": true, 00:13:44.542 "reset": true, 00:13:44.542 "compare": false, 00:13:44.542 "compare_and_write": false, 00:13:44.542 "abort": true, 00:13:44.542 "nvme_admin": false, 00:13:44.542 "nvme_io": false 00:13:44.542 }, 00:13:44.542 "memory_domains": [ 00:13:44.542 { 00:13:44.542 "dma_device_id": "system", 00:13:44.542 "dma_device_type": 1 00:13:44.542 }, 00:13:44.542 { 00:13:44.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.542 "dma_device_type": 2 00:13:44.542 } 00:13:44.542 ], 00:13:44.542 "driver_specific": {} 00:13:44.542 } 00:13:44.542 ] 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:44.800 23:55:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.800 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:44.800 "name": "Existed_Raid", 00:13:44.800 "uuid": "b5de346a-9e07-49da-a6e9-d082f9f52629", 00:13:44.800 "strip_size_kb": 64, 00:13:44.801 "state": "configuring", 00:13:44.801 "raid_level": "concat", 00:13:44.801 "superblock": true, 00:13:44.801 "num_base_bdevs": 3, 00:13:44.801 "num_base_bdevs_discovered": 1, 00:13:44.801 "num_base_bdevs_operational": 3, 00:13:44.801 "base_bdevs_list": [ 00:13:44.801 { 00:13:44.801 "name": "BaseBdev1", 00:13:44.801 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:44.801 "is_configured": true, 00:13:44.801 "data_offset": 2048, 00:13:44.801 "data_size": 63488 00:13:44.801 }, 00:13:44.801 { 00:13:44.801 "name": "BaseBdev2", 00:13:44.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.801 "is_configured": false, 00:13:44.801 "data_offset": 0, 00:13:44.801 "data_size": 0 00:13:44.801 }, 00:13:44.801 { 00:13:44.801 "name": "BaseBdev3", 00:13:44.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.801 "is_configured": false, 00:13:44.801 "data_offset": 0, 00:13:44.801 "data_size": 0 00:13:44.801 } 00:13:44.801 ] 00:13:44.801 }' 00:13:44.801 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:44.801 23:55:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:45.368 23:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:45.627 [2024-05-14 23:55:46.135326] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:45.627 [2024-05-14 23:55:46.135367] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a74b0 name Existed_Raid, state configuring 00:13:45.627 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.885 [2024-05-14 23:55:46.384025] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:45.885 [2024-05-14 23:55:46.385508] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:45.885 [2024-05-14 23:55:46.385540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.885 [2024-05-14 23:55:46.385550] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.885 [2024-05-14 23:55:46.385562] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:45.885 23:55:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:45.885 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:45.886 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:45.886 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:45.886 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:45.886 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.886 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.144 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:46.144 "name": "Existed_Raid", 00:13:46.144 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:46.144 "strip_size_kb": 64, 00:13:46.144 "state": "configuring", 00:13:46.144 "raid_level": "concat", 00:13:46.144 "superblock": true, 00:13:46.144 "num_base_bdevs": 3, 00:13:46.144 "num_base_bdevs_discovered": 1, 00:13:46.144 "num_base_bdevs_operational": 3, 00:13:46.144 "base_bdevs_list": [ 00:13:46.144 { 00:13:46.145 "name": "BaseBdev1", 00:13:46.145 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:46.145 "is_configured": true, 00:13:46.145 "data_offset": 2048, 00:13:46.145 "data_size": 63488 00:13:46.145 }, 00:13:46.145 { 00:13:46.145 "name": "BaseBdev2", 00:13:46.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.145 "is_configured": false, 00:13:46.145 "data_offset": 0, 00:13:46.145 "data_size": 0 00:13:46.145 }, 00:13:46.145 { 00:13:46.145 "name": "BaseBdev3", 00:13:46.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.145 "is_configured": false, 00:13:46.145 "data_offset": 0, 00:13:46.145 "data_size": 0 00:13:46.145 } 00:13:46.145 ] 00:13:46.145 }' 00:13:46.145 23:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:46.145 23:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.711 23:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:46.970 [2024-05-14 23:55:47.494312] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:46.970 BaseBdev2 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:46.970 23:55:47 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:46.970 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.228 23:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:47.487 [ 00:13:47.487 { 00:13:47.487 "name": "BaseBdev2", 00:13:47.487 "aliases": [ 00:13:47.487 "456bd8bb-9743-4b36-84e3-df3b2f5d478c" 00:13:47.487 ], 00:13:47.487 "product_name": "Malloc disk", 00:13:47.487 "block_size": 512, 00:13:47.487 "num_blocks": 65536, 00:13:47.487 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:47.487 "assigned_rate_limits": { 00:13:47.487 "rw_ios_per_sec": 0, 00:13:47.487 "rw_mbytes_per_sec": 0, 00:13:47.487 "r_mbytes_per_sec": 0, 00:13:47.487 "w_mbytes_per_sec": 0 00:13:47.487 }, 00:13:47.487 "claimed": true, 00:13:47.487 "claim_type": "exclusive_write", 00:13:47.487 "zoned": false, 00:13:47.487 "supported_io_types": { 00:13:47.487 "read": true, 00:13:47.487 "write": true, 00:13:47.487 "unmap": true, 00:13:47.487 "write_zeroes": true, 00:13:47.487 "flush": true, 00:13:47.487 "reset": true, 00:13:47.487 "compare": false, 00:13:47.487 "compare_and_write": false, 00:13:47.487 "abort": true, 00:13:47.487 "nvme_admin": false, 00:13:47.487 "nvme_io": false 00:13:47.487 }, 00:13:47.487 "memory_domains": [ 00:13:47.487 { 00:13:47.487 "dma_device_id": "system", 00:13:47.487 "dma_device_type": 1 00:13:47.487 }, 00:13:47.487 { 00:13:47.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.487 "dma_device_type": 2 00:13:47.487 } 00:13:47.487 ], 00:13:47.487 "driver_specific": {} 00:13:47.487 } 00:13:47.487 ] 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:47.487 23:55:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.487 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.745 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:47.745 "name": "Existed_Raid", 00:13:47.745 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:47.745 "strip_size_kb": 64, 00:13:47.745 "state": "configuring", 00:13:47.745 "raid_level": "concat", 00:13:47.745 "superblock": true, 00:13:47.745 "num_base_bdevs": 3, 00:13:47.745 "num_base_bdevs_discovered": 2, 00:13:47.745 "num_base_bdevs_operational": 3, 00:13:47.745 "base_bdevs_list": [ 00:13:47.745 { 00:13:47.745 "name": "BaseBdev1", 00:13:47.745 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:47.745 "is_configured": true, 00:13:47.745 "data_offset": 2048, 00:13:47.745 "data_size": 63488 00:13:47.745 }, 00:13:47.745 { 00:13:47.745 "name": "BaseBdev2", 00:13:47.745 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:47.745 "is_configured": true, 00:13:47.745 "data_offset": 2048, 00:13:47.745 "data_size": 63488 00:13:47.745 }, 00:13:47.745 { 00:13:47.745 "name": "BaseBdev3", 00:13:47.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.745 "is_configured": false, 00:13:47.745 "data_offset": 0, 00:13:47.745 "data_size": 0 00:13:47.745 } 00:13:47.745 ] 00:13:47.745 }' 00:13:47.745 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:47.745 23:55:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.312 23:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:48.570 [2024-05-14 23:55:49.078052] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:48.570 [2024-05-14 23:55:49.078214] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x11a8560 00:13:48.570 [2024-05-14 23:55:49.078228] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:48.570 [2024-05-14 23:55:49.078420] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11bf490 00:13:48.570 [2024-05-14 23:55:49.078542] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11a8560 00:13:48.570 [2024-05-14 23:55:49.078552] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11a8560 00:13:48.570 [2024-05-14 23:55:49.078649] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:48.570 BaseBdev3 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:48.570 
23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:48.570 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:48.828 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:49.086 [ 00:13:49.086 { 00:13:49.086 "name": "BaseBdev3", 00:13:49.086 "aliases": [ 00:13:49.086 "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94" 00:13:49.086 ], 00:13:49.086 "product_name": "Malloc disk", 00:13:49.086 "block_size": 512, 00:13:49.086 "num_blocks": 65536, 00:13:49.086 "uuid": "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94", 00:13:49.086 "assigned_rate_limits": { 00:13:49.086 "rw_ios_per_sec": 0, 00:13:49.086 "rw_mbytes_per_sec": 0, 00:13:49.086 "r_mbytes_per_sec": 0, 00:13:49.086 "w_mbytes_per_sec": 0 00:13:49.086 }, 00:13:49.086 "claimed": true, 00:13:49.086 "claim_type": "exclusive_write", 00:13:49.086 "zoned": false, 00:13:49.086 "supported_io_types": { 00:13:49.086 "read": true, 00:13:49.086 "write": true, 00:13:49.086 "unmap": true, 00:13:49.086 "write_zeroes": true, 00:13:49.086 "flush": true, 00:13:49.086 "reset": true, 00:13:49.086 "compare": false, 00:13:49.086 "compare_and_write": false, 00:13:49.086 "abort": true, 00:13:49.086 "nvme_admin": false, 00:13:49.086 "nvme_io": false 00:13:49.086 }, 00:13:49.086 "memory_domains": [ 00:13:49.086 { 00:13:49.086 "dma_device_id": "system", 00:13:49.086 "dma_device_type": 1 00:13:49.086 }, 00:13:49.086 { 00:13:49.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.086 "dma_device_type": 2 00:13:49.086 } 00:13:49.086 ], 00:13:49.086 "driver_specific": {} 00:13:49.086 } 00:13:49.086 ] 00:13:49.086 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:49.086 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.087 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.345 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:49.345 "name": "Existed_Raid", 00:13:49.345 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:49.345 "strip_size_kb": 64, 00:13:49.345 "state": "online", 00:13:49.345 "raid_level": "concat", 00:13:49.345 "superblock": true, 00:13:49.345 "num_base_bdevs": 3, 00:13:49.345 "num_base_bdevs_discovered": 3, 00:13:49.345 "num_base_bdevs_operational": 3, 00:13:49.345 "base_bdevs_list": [ 00:13:49.345 { 00:13:49.345 "name": "BaseBdev1", 00:13:49.345 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:49.345 "is_configured": true, 00:13:49.345 "data_offset": 2048, 00:13:49.345 "data_size": 63488 00:13:49.345 }, 00:13:49.345 { 00:13:49.345 "name": "BaseBdev2", 00:13:49.345 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:49.345 "is_configured": true, 00:13:49.345 "data_offset": 2048, 00:13:49.345 "data_size": 63488 00:13:49.345 }, 00:13:49.345 { 00:13:49.345 "name": "BaseBdev3", 00:13:49.345 "uuid": "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94", 00:13:49.345 "is_configured": true, 00:13:49.345 "data_offset": 2048, 00:13:49.345 "data_size": 63488 00:13:49.345 } 00:13:49.345 ] 00:13:49.345 }' 00:13:49.345 23:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:49.345 23:55:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:49.912 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:50.170 [2024-05-14 23:55:50.662575] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:50.170 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:50.170 "name": "Existed_Raid", 00:13:50.170 "aliases": [ 00:13:50.170 "1f8dddf4-3e5b-4c74-bf06-59b8285c906c" 00:13:50.170 ], 00:13:50.170 "product_name": "Raid Volume", 00:13:50.170 "block_size": 512, 00:13:50.170 "num_blocks": 190464, 00:13:50.170 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:50.170 "assigned_rate_limits": { 00:13:50.170 "rw_ios_per_sec": 0, 00:13:50.170 "rw_mbytes_per_sec": 0, 00:13:50.170 "r_mbytes_per_sec": 0, 00:13:50.170 "w_mbytes_per_sec": 0 00:13:50.170 }, 00:13:50.170 "claimed": false, 00:13:50.170 "zoned": false, 00:13:50.170 "supported_io_types": { 00:13:50.170 "read": true, 00:13:50.170 "write": true, 00:13:50.170 "unmap": true, 00:13:50.170 "write_zeroes": true, 
00:13:50.170 "flush": true, 00:13:50.170 "reset": true, 00:13:50.170 "compare": false, 00:13:50.170 "compare_and_write": false, 00:13:50.170 "abort": false, 00:13:50.170 "nvme_admin": false, 00:13:50.170 "nvme_io": false 00:13:50.170 }, 00:13:50.170 "memory_domains": [ 00:13:50.170 { 00:13:50.170 "dma_device_id": "system", 00:13:50.170 "dma_device_type": 1 00:13:50.170 }, 00:13:50.170 { 00:13:50.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.170 "dma_device_type": 2 00:13:50.170 }, 00:13:50.170 { 00:13:50.171 "dma_device_id": "system", 00:13:50.171 "dma_device_type": 1 00:13:50.171 }, 00:13:50.171 { 00:13:50.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.171 "dma_device_type": 2 00:13:50.171 }, 00:13:50.171 { 00:13:50.171 "dma_device_id": "system", 00:13:50.171 "dma_device_type": 1 00:13:50.171 }, 00:13:50.171 { 00:13:50.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.171 "dma_device_type": 2 00:13:50.171 } 00:13:50.171 ], 00:13:50.171 "driver_specific": { 00:13:50.171 "raid": { 00:13:50.171 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:50.171 "strip_size_kb": 64, 00:13:50.171 "state": "online", 00:13:50.171 "raid_level": "concat", 00:13:50.171 "superblock": true, 00:13:50.171 "num_base_bdevs": 3, 00:13:50.171 "num_base_bdevs_discovered": 3, 00:13:50.171 "num_base_bdevs_operational": 3, 00:13:50.171 "base_bdevs_list": [ 00:13:50.171 { 00:13:50.171 "name": "BaseBdev1", 00:13:50.171 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:50.171 "is_configured": true, 00:13:50.171 "data_offset": 2048, 00:13:50.171 "data_size": 63488 00:13:50.171 }, 00:13:50.171 { 00:13:50.171 "name": "BaseBdev2", 00:13:50.171 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:50.171 "is_configured": true, 00:13:50.171 "data_offset": 2048, 00:13:50.171 "data_size": 63488 00:13:50.171 }, 00:13:50.171 { 00:13:50.171 "name": "BaseBdev3", 00:13:50.171 "uuid": "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94", 00:13:50.171 "is_configured": true, 00:13:50.171 "data_offset": 2048, 00:13:50.171 "data_size": 63488 00:13:50.171 } 00:13:50.171 ] 00:13:50.171 } 00:13:50.171 } 00:13:50.171 }' 00:13:50.171 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:50.171 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:13:50.171 BaseBdev2 00:13:50.171 BaseBdev3' 00:13:50.171 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:50.171 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:50.171 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:50.430 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:50.430 "name": "BaseBdev1", 00:13:50.430 "aliases": [ 00:13:50.430 "3721c918-cf56-4f27-bbde-e04431f28752" 00:13:50.430 ], 00:13:50.430 "product_name": "Malloc disk", 00:13:50.430 "block_size": 512, 00:13:50.430 "num_blocks": 65536, 00:13:50.430 "uuid": "3721c918-cf56-4f27-bbde-e04431f28752", 00:13:50.430 "assigned_rate_limits": { 00:13:50.430 "rw_ios_per_sec": 0, 00:13:50.430 "rw_mbytes_per_sec": 0, 00:13:50.430 "r_mbytes_per_sec": 0, 00:13:50.430 "w_mbytes_per_sec": 0 00:13:50.430 }, 00:13:50.430 "claimed": true, 00:13:50.430 "claim_type": 
"exclusive_write", 00:13:50.430 "zoned": false, 00:13:50.430 "supported_io_types": { 00:13:50.430 "read": true, 00:13:50.430 "write": true, 00:13:50.430 "unmap": true, 00:13:50.430 "write_zeroes": true, 00:13:50.430 "flush": true, 00:13:50.430 "reset": true, 00:13:50.430 "compare": false, 00:13:50.430 "compare_and_write": false, 00:13:50.430 "abort": true, 00:13:50.430 "nvme_admin": false, 00:13:50.430 "nvme_io": false 00:13:50.430 }, 00:13:50.430 "memory_domains": [ 00:13:50.430 { 00:13:50.430 "dma_device_id": "system", 00:13:50.430 "dma_device_type": 1 00:13:50.430 }, 00:13:50.430 { 00:13:50.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.430 "dma_device_type": 2 00:13:50.430 } 00:13:50.430 ], 00:13:50.430 "driver_specific": {} 00:13:50.430 }' 00:13:50.430 23:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.430 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:50.688 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.689 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.948 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.948 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:50.948 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:50.948 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:50.948 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:51.206 "name": "BaseBdev2", 00:13:51.206 "aliases": [ 00:13:51.206 "456bd8bb-9743-4b36-84e3-df3b2f5d478c" 00:13:51.206 ], 00:13:51.206 "product_name": "Malloc disk", 00:13:51.206 "block_size": 512, 00:13:51.206 "num_blocks": 65536, 00:13:51.206 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:51.206 "assigned_rate_limits": { 00:13:51.206 "rw_ios_per_sec": 0, 00:13:51.206 "rw_mbytes_per_sec": 0, 00:13:51.206 "r_mbytes_per_sec": 0, 00:13:51.206 "w_mbytes_per_sec": 0 00:13:51.206 }, 00:13:51.206 "claimed": true, 00:13:51.206 "claim_type": "exclusive_write", 00:13:51.206 "zoned": false, 00:13:51.206 "supported_io_types": { 00:13:51.206 "read": true, 00:13:51.206 "write": true, 00:13:51.206 "unmap": true, 00:13:51.206 "write_zeroes": true, 00:13:51.206 "flush": true, 00:13:51.206 "reset": true, 00:13:51.206 "compare": false, 00:13:51.206 "compare_and_write": false, 00:13:51.206 "abort": true, 00:13:51.206 "nvme_admin": false, 00:13:51.206 "nvme_io": false 
00:13:51.206 }, 00:13:51.206 "memory_domains": [ 00:13:51.206 { 00:13:51.206 "dma_device_id": "system", 00:13:51.206 "dma_device_type": 1 00:13:51.206 }, 00:13:51.206 { 00:13:51.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.206 "dma_device_type": 2 00:13:51.206 } 00:13:51.206 ], 00:13:51.206 "driver_specific": {} 00:13:51.206 }' 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:51.206 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:51.464 23:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:51.721 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:51.721 "name": "BaseBdev3", 00:13:51.721 "aliases": [ 00:13:51.721 "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94" 00:13:51.721 ], 00:13:51.721 "product_name": "Malloc disk", 00:13:51.721 "block_size": 512, 00:13:51.721 "num_blocks": 65536, 00:13:51.721 "uuid": "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94", 00:13:51.721 "assigned_rate_limits": { 00:13:51.721 "rw_ios_per_sec": 0, 00:13:51.721 "rw_mbytes_per_sec": 0, 00:13:51.721 "r_mbytes_per_sec": 0, 00:13:51.721 "w_mbytes_per_sec": 0 00:13:51.721 }, 00:13:51.721 "claimed": true, 00:13:51.721 "claim_type": "exclusive_write", 00:13:51.721 "zoned": false, 00:13:51.721 "supported_io_types": { 00:13:51.721 "read": true, 00:13:51.721 "write": true, 00:13:51.722 "unmap": true, 00:13:51.722 "write_zeroes": true, 00:13:51.722 "flush": true, 00:13:51.722 "reset": true, 00:13:51.722 "compare": false, 00:13:51.722 "compare_and_write": false, 00:13:51.722 "abort": true, 00:13:51.722 "nvme_admin": false, 00:13:51.722 "nvme_io": false 00:13:51.722 }, 00:13:51.722 "memory_domains": [ 00:13:51.722 { 00:13:51.722 "dma_device_id": "system", 00:13:51.722 "dma_device_type": 1 00:13:51.722 }, 00:13:51.722 { 00:13:51.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.722 "dma_device_type": 2 00:13:51.722 } 00:13:51.722 ], 00:13:51.722 "driver_specific": {} 00:13:51.722 }' 00:13:51.722 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:13:51.722 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:51.722 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:51.722 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:51.722 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:51.978 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:52.234 [2024-05-14 23:55:52.731839] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:52.234 [2024-05-14 23:55:52.731867] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:52.234 [2024-05-14 23:55:52.731909] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:13:52.234 23:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.492 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:52.492 "name": "Existed_Raid", 00:13:52.492 "uuid": "1f8dddf4-3e5b-4c74-bf06-59b8285c906c", 00:13:52.492 "strip_size_kb": 64, 00:13:52.492 "state": "offline", 00:13:52.492 "raid_level": "concat", 00:13:52.492 "superblock": true, 00:13:52.492 "num_base_bdevs": 3, 00:13:52.492 "num_base_bdevs_discovered": 2, 00:13:52.492 "num_base_bdevs_operational": 2, 00:13:52.492 "base_bdevs_list": [ 00:13:52.492 { 00:13:52.492 "name": null, 00:13:52.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.492 "is_configured": false, 00:13:52.492 "data_offset": 2048, 00:13:52.492 "data_size": 63488 00:13:52.492 }, 00:13:52.492 { 00:13:52.492 "name": "BaseBdev2", 00:13:52.492 "uuid": "456bd8bb-9743-4b36-84e3-df3b2f5d478c", 00:13:52.492 "is_configured": true, 00:13:52.492 "data_offset": 2048, 00:13:52.492 "data_size": 63488 00:13:52.492 }, 00:13:52.492 { 00:13:52.492 "name": "BaseBdev3", 00:13:52.492 "uuid": "f27f1ae9-f400-4aa5-9c98-ccaa030b1e94", 00:13:52.492 "is_configured": true, 00:13:52.492 "data_offset": 2048, 00:13:52.492 "data_size": 63488 00:13:52.492 } 00:13:52.492 ] 00:13:52.492 }' 00:13:52.492 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:52.492 23:55:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.058 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:13:53.058 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:53.058 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.058 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:53.316 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:53.316 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:53.316 23:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:53.574 [2024-05-14 23:55:54.072440] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:53.574 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:53.574 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:53.574 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.574 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:53.831 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:53.831 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:53.831 23:55:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:54.088 [2024-05-14 23:55:54.574057] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:54.088 [2024-05-14 23:55:54.574107] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a8560 name Existed_Raid, state offline 00:13:54.088 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:54.088 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:54.088 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.088 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:54.346 23:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:54.603 BaseBdev2 00:13:54.603 23:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:13:54.603 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:54.604 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:54.604 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:54.604 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:54.604 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:54.604 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.861 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:55.119 [ 00:13:55.119 { 00:13:55.119 "name": "BaseBdev2", 00:13:55.119 "aliases": [ 00:13:55.119 "bba6d72e-23c2-4c3a-ab16-46ecc94276ef" 00:13:55.119 ], 00:13:55.119 "product_name": "Malloc disk", 00:13:55.119 "block_size": 512, 00:13:55.119 "num_blocks": 65536, 00:13:55.119 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:13:55.119 "assigned_rate_limits": { 00:13:55.119 "rw_ios_per_sec": 0, 00:13:55.119 "rw_mbytes_per_sec": 0, 00:13:55.119 "r_mbytes_per_sec": 0, 00:13:55.119 "w_mbytes_per_sec": 0 00:13:55.119 }, 00:13:55.119 "claimed": false, 00:13:55.119 "zoned": false, 00:13:55.119 "supported_io_types": { 00:13:55.119 "read": true, 00:13:55.119 "write": true, 
00:13:55.119 "unmap": true, 00:13:55.119 "write_zeroes": true, 00:13:55.119 "flush": true, 00:13:55.119 "reset": true, 00:13:55.119 "compare": false, 00:13:55.119 "compare_and_write": false, 00:13:55.119 "abort": true, 00:13:55.119 "nvme_admin": false, 00:13:55.119 "nvme_io": false 00:13:55.119 }, 00:13:55.119 "memory_domains": [ 00:13:55.119 { 00:13:55.119 "dma_device_id": "system", 00:13:55.119 "dma_device_type": 1 00:13:55.119 }, 00:13:55.119 { 00:13:55.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.119 "dma_device_type": 2 00:13:55.119 } 00:13:55.119 ], 00:13:55.119 "driver_specific": {} 00:13:55.119 } 00:13:55.119 ] 00:13:55.119 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:55.119 23:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:55.119 23:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:55.119 23:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:55.377 BaseBdev3 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:55.377 23:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:55.634 23:55:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:55.911 [ 00:13:55.911 { 00:13:55.911 "name": "BaseBdev3", 00:13:55.911 "aliases": [ 00:13:55.911 "3846529a-aa0d-4be7-a1e2-59d4435ba32c" 00:13:55.911 ], 00:13:55.911 "product_name": "Malloc disk", 00:13:55.911 "block_size": 512, 00:13:55.911 "num_blocks": 65536, 00:13:55.911 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:13:55.911 "assigned_rate_limits": { 00:13:55.911 "rw_ios_per_sec": 0, 00:13:55.911 "rw_mbytes_per_sec": 0, 00:13:55.911 "r_mbytes_per_sec": 0, 00:13:55.911 "w_mbytes_per_sec": 0 00:13:55.911 }, 00:13:55.911 "claimed": false, 00:13:55.911 "zoned": false, 00:13:55.911 "supported_io_types": { 00:13:55.911 "read": true, 00:13:55.911 "write": true, 00:13:55.911 "unmap": true, 00:13:55.911 "write_zeroes": true, 00:13:55.911 "flush": true, 00:13:55.911 "reset": true, 00:13:55.911 "compare": false, 00:13:55.911 "compare_and_write": false, 00:13:55.911 "abort": true, 00:13:55.911 "nvme_admin": false, 00:13:55.911 "nvme_io": false 00:13:55.911 }, 00:13:55.911 "memory_domains": [ 00:13:55.911 { 00:13:55.911 "dma_device_id": "system", 00:13:55.911 "dma_device_type": 1 00:13:55.911 }, 00:13:55.911 { 00:13:55.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.911 "dma_device_type": 2 00:13:55.911 } 
00:13:55.911 ], 00:13:55.911 "driver_specific": {} 00:13:55.911 } 00:13:55.911 ] 00:13:55.911 23:55:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:55.911 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:55.911 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:55.911 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.199 [2024-05-14 23:55:56.497802] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:56.199 [2024-05-14 23:55:56.497848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:56.199 [2024-05-14 23:55:56.497868] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.199 [2024-05-14 23:55:56.499257] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:56.199 "name": "Existed_Raid", 00:13:56.199 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:13:56.199 "strip_size_kb": 64, 00:13:56.199 "state": "configuring", 00:13:56.199 "raid_level": "concat", 00:13:56.199 "superblock": true, 00:13:56.199 "num_base_bdevs": 3, 00:13:56.199 "num_base_bdevs_discovered": 2, 00:13:56.199 "num_base_bdevs_operational": 3, 00:13:56.199 "base_bdevs_list": [ 00:13:56.199 { 00:13:56.199 "name": "BaseBdev1", 00:13:56.199 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.199 "is_configured": false, 00:13:56.199 "data_offset": 0, 00:13:56.199 "data_size": 0 00:13:56.199 }, 00:13:56.199 { 00:13:56.199 "name": "BaseBdev2", 00:13:56.199 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 
00:13:56.199 "is_configured": true, 00:13:56.199 "data_offset": 2048, 00:13:56.199 "data_size": 63488 00:13:56.199 }, 00:13:56.199 { 00:13:56.199 "name": "BaseBdev3", 00:13:56.199 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:13:56.199 "is_configured": true, 00:13:56.199 "data_offset": 2048, 00:13:56.199 "data_size": 63488 00:13:56.199 } 00:13:56.199 ] 00:13:56.199 }' 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:56.199 23:55:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:57.135 [2024-05-14 23:55:57.580646] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.135 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.393 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:57.393 "name": "Existed_Raid", 00:13:57.393 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:13:57.393 "strip_size_kb": 64, 00:13:57.393 "state": "configuring", 00:13:57.393 "raid_level": "concat", 00:13:57.393 "superblock": true, 00:13:57.393 "num_base_bdevs": 3, 00:13:57.393 "num_base_bdevs_discovered": 1, 00:13:57.393 "num_base_bdevs_operational": 3, 00:13:57.393 "base_bdevs_list": [ 00:13:57.393 { 00:13:57.393 "name": "BaseBdev1", 00:13:57.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.393 "is_configured": false, 00:13:57.393 "data_offset": 0, 00:13:57.394 "data_size": 0 00:13:57.394 }, 00:13:57.394 { 00:13:57.394 "name": null, 00:13:57.394 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:13:57.394 "is_configured": false, 00:13:57.394 "data_offset": 2048, 00:13:57.394 "data_size": 63488 00:13:57.394 }, 00:13:57.394 { 00:13:57.394 "name": "BaseBdev3", 00:13:57.394 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:13:57.394 "is_configured": true, 00:13:57.394 
"data_offset": 2048, 00:13:57.394 "data_size": 63488 00:13:57.394 } 00:13:57.394 ] 00:13:57.394 }' 00:13:57.394 23:55:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:57.394 23:55:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.960 23:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.960 23:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:58.217 23:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:13:58.217 23:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:58.475 [2024-05-14 23:55:58.919581] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:58.475 BaseBdev1 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:58.475 23:55:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.733 23:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:58.991 [ 00:13:58.991 { 00:13:58.991 "name": "BaseBdev1", 00:13:58.991 "aliases": [ 00:13:58.991 "51a80459-a696-46a9-9135-4e02a26bd710" 00:13:58.991 ], 00:13:58.991 "product_name": "Malloc disk", 00:13:58.991 "block_size": 512, 00:13:58.991 "num_blocks": 65536, 00:13:58.991 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:13:58.991 "assigned_rate_limits": { 00:13:58.992 "rw_ios_per_sec": 0, 00:13:58.992 "rw_mbytes_per_sec": 0, 00:13:58.992 "r_mbytes_per_sec": 0, 00:13:58.992 "w_mbytes_per_sec": 0 00:13:58.992 }, 00:13:58.992 "claimed": true, 00:13:58.992 "claim_type": "exclusive_write", 00:13:58.992 "zoned": false, 00:13:58.992 "supported_io_types": { 00:13:58.992 "read": true, 00:13:58.992 "write": true, 00:13:58.992 "unmap": true, 00:13:58.992 "write_zeroes": true, 00:13:58.992 "flush": true, 00:13:58.992 "reset": true, 00:13:58.992 "compare": false, 00:13:58.992 "compare_and_write": false, 00:13:58.992 "abort": true, 00:13:58.992 "nvme_admin": false, 00:13:58.992 "nvme_io": false 00:13:58.992 }, 00:13:58.992 "memory_domains": [ 00:13:58.992 { 00:13:58.992 "dma_device_id": "system", 00:13:58.992 "dma_device_type": 1 00:13:58.992 }, 00:13:58.992 { 00:13:58.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.992 "dma_device_type": 2 00:13:58.992 } 00:13:58.992 ], 
00:13:58.992 "driver_specific": {} 00:13:58.992 } 00:13:58.992 ] 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.992 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.250 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:59.250 "name": "Existed_Raid", 00:13:59.250 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:13:59.250 "strip_size_kb": 64, 00:13:59.250 "state": "configuring", 00:13:59.250 "raid_level": "concat", 00:13:59.250 "superblock": true, 00:13:59.250 "num_base_bdevs": 3, 00:13:59.250 "num_base_bdevs_discovered": 2, 00:13:59.250 "num_base_bdevs_operational": 3, 00:13:59.250 "base_bdevs_list": [ 00:13:59.250 { 00:13:59.250 "name": "BaseBdev1", 00:13:59.250 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:13:59.250 "is_configured": true, 00:13:59.250 "data_offset": 2048, 00:13:59.250 "data_size": 63488 00:13:59.250 }, 00:13:59.250 { 00:13:59.250 "name": null, 00:13:59.250 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:13:59.250 "is_configured": false, 00:13:59.250 "data_offset": 2048, 00:13:59.250 "data_size": 63488 00:13:59.250 }, 00:13:59.250 { 00:13:59.250 "name": "BaseBdev3", 00:13:59.250 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:13:59.250 "is_configured": true, 00:13:59.250 "data_offset": 2048, 00:13:59.250 "data_size": 63488 00:13:59.250 } 00:13:59.250 ] 00:13:59.250 }' 00:13:59.250 23:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:59.250 23:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.815 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.815 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:00.072 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ 
true == \t\r\u\e ]] 00:14:00.072 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:00.330 [2024-05-14 23:56:00.716498] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.330 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.587 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:00.587 "name": "Existed_Raid", 00:14:00.587 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:00.587 "strip_size_kb": 64, 00:14:00.587 "state": "configuring", 00:14:00.587 "raid_level": "concat", 00:14:00.587 "superblock": true, 00:14:00.587 "num_base_bdevs": 3, 00:14:00.587 "num_base_bdevs_discovered": 1, 00:14:00.587 "num_base_bdevs_operational": 3, 00:14:00.587 "base_bdevs_list": [ 00:14:00.587 { 00:14:00.587 "name": "BaseBdev1", 00:14:00.587 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:00.587 "is_configured": true, 00:14:00.587 "data_offset": 2048, 00:14:00.587 "data_size": 63488 00:14:00.587 }, 00:14:00.587 { 00:14:00.587 "name": null, 00:14:00.587 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:00.587 "is_configured": false, 00:14:00.587 "data_offset": 2048, 00:14:00.587 "data_size": 63488 00:14:00.587 }, 00:14:00.587 { 00:14:00.587 "name": null, 00:14:00.587 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:00.587 "is_configured": false, 00:14:00.587 "data_offset": 2048, 00:14:00.587 "data_size": 63488 00:14:00.587 } 00:14:00.587 ] 00:14:00.587 }' 00:14:00.587 23:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:00.587 23:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.152 23:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.152 23:56:01 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:01.410 23:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:01.410 23:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:01.668 [2024-05-14 23:56:02.064111] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.668 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.926 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:01.926 "name": "Existed_Raid", 00:14:01.926 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:01.926 "strip_size_kb": 64, 00:14:01.926 "state": "configuring", 00:14:01.926 "raid_level": "concat", 00:14:01.926 "superblock": true, 00:14:01.926 "num_base_bdevs": 3, 00:14:01.926 "num_base_bdevs_discovered": 2, 00:14:01.926 "num_base_bdevs_operational": 3, 00:14:01.926 "base_bdevs_list": [ 00:14:01.926 { 00:14:01.926 "name": "BaseBdev1", 00:14:01.926 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:01.926 "is_configured": true, 00:14:01.926 "data_offset": 2048, 00:14:01.926 "data_size": 63488 00:14:01.926 }, 00:14:01.926 { 00:14:01.926 "name": null, 00:14:01.926 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:01.926 "is_configured": false, 00:14:01.926 "data_offset": 2048, 00:14:01.926 "data_size": 63488 00:14:01.926 }, 00:14:01.926 { 00:14:01.926 "name": "BaseBdev3", 00:14:01.926 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:01.926 "is_configured": true, 00:14:01.926 "data_offset": 2048, 00:14:01.926 "data_size": 63488 00:14:01.926 } 00:14:01.926 ] 00:14:01.926 }' 00:14:01.927 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:01.927 23:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.492 23:56:02 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:02.492 23:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.750 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:02.750 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.750 [2024-05-14 23:56:03.311445] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:03.011 "name": "Existed_Raid", 00:14:03.011 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:03.011 "strip_size_kb": 64, 00:14:03.011 "state": "configuring", 00:14:03.011 "raid_level": "concat", 00:14:03.011 "superblock": true, 00:14:03.011 "num_base_bdevs": 3, 00:14:03.011 "num_base_bdevs_discovered": 1, 00:14:03.011 "num_base_bdevs_operational": 3, 00:14:03.011 "base_bdevs_list": [ 00:14:03.011 { 00:14:03.011 "name": null, 00:14:03.011 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:03.011 "is_configured": false, 00:14:03.011 "data_offset": 2048, 00:14:03.011 "data_size": 63488 00:14:03.011 }, 00:14:03.011 { 00:14:03.011 "name": null, 00:14:03.011 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:03.011 "is_configured": false, 00:14:03.011 "data_offset": 2048, 00:14:03.011 "data_size": 63488 00:14:03.011 }, 00:14:03.011 { 00:14:03.011 "name": "BaseBdev3", 00:14:03.011 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:03.011 "is_configured": true, 00:14:03.011 "data_offset": 2048, 00:14:03.011 "data_size": 63488 00:14:03.011 } 00:14:03.011 ] 00:14:03.011 }' 00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:14:03.011 23:56:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.972 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.972 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:03.972 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:03.972 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:04.231 [2024-05-14 23:56:04.619144] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.231 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.490 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:04.490 "name": "Existed_Raid", 00:14:04.490 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:04.490 "strip_size_kb": 64, 00:14:04.490 "state": "configuring", 00:14:04.490 "raid_level": "concat", 00:14:04.490 "superblock": true, 00:14:04.490 "num_base_bdevs": 3, 00:14:04.490 "num_base_bdevs_discovered": 2, 00:14:04.490 "num_base_bdevs_operational": 3, 00:14:04.490 "base_bdevs_list": [ 00:14:04.490 { 00:14:04.490 "name": null, 00:14:04.490 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:04.490 "is_configured": false, 00:14:04.490 "data_offset": 2048, 00:14:04.490 "data_size": 63488 00:14:04.490 }, 00:14:04.490 { 00:14:04.490 "name": "BaseBdev2", 00:14:04.490 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:04.490 "is_configured": true, 00:14:04.490 "data_offset": 2048, 00:14:04.490 "data_size": 63488 00:14:04.490 }, 00:14:04.490 { 00:14:04.490 "name": "BaseBdev3", 00:14:04.490 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:04.490 "is_configured": true, 00:14:04.490 
"data_offset": 2048, 00:14:04.490 "data_size": 63488 00:14:04.490 } 00:14:04.490 ] 00:14:04.490 }' 00:14:04.490 23:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:04.490 23:56:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.055 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.056 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:05.313 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:05.313 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.313 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:05.570 23:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 51a80459-a696-46a9-9135-4e02a26bd710 00:14:05.827 [2024-05-14 23:56:06.216006] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:05.827 [2024-05-14 23:56:06.216167] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x134c650 00:14:05.827 [2024-05-14 23:56:06.216180] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:05.827 [2024-05-14 23:56:06.216361] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13591f0 00:14:05.827 [2024-05-14 23:56:06.216501] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x134c650 00:14:05.827 [2024-05-14 23:56:06.216512] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x134c650 00:14:05.827 [2024-05-14 23:56:06.216612] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:05.827 NewBaseBdev 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:05.827 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:06.085 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:06.342 [ 00:14:06.342 { 00:14:06.342 "name": "NewBaseBdev", 00:14:06.342 "aliases": [ 00:14:06.342 "51a80459-a696-46a9-9135-4e02a26bd710" 00:14:06.342 ], 
00:14:06.342 "product_name": "Malloc disk", 00:14:06.342 "block_size": 512, 00:14:06.342 "num_blocks": 65536, 00:14:06.342 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:06.343 "assigned_rate_limits": { 00:14:06.343 "rw_ios_per_sec": 0, 00:14:06.343 "rw_mbytes_per_sec": 0, 00:14:06.343 "r_mbytes_per_sec": 0, 00:14:06.343 "w_mbytes_per_sec": 0 00:14:06.343 }, 00:14:06.343 "claimed": true, 00:14:06.343 "claim_type": "exclusive_write", 00:14:06.343 "zoned": false, 00:14:06.343 "supported_io_types": { 00:14:06.343 "read": true, 00:14:06.343 "write": true, 00:14:06.343 "unmap": true, 00:14:06.343 "write_zeroes": true, 00:14:06.343 "flush": true, 00:14:06.343 "reset": true, 00:14:06.343 "compare": false, 00:14:06.343 "compare_and_write": false, 00:14:06.343 "abort": true, 00:14:06.343 "nvme_admin": false, 00:14:06.343 "nvme_io": false 00:14:06.343 }, 00:14:06.343 "memory_domains": [ 00:14:06.343 { 00:14:06.343 "dma_device_id": "system", 00:14:06.343 "dma_device_type": 1 00:14:06.343 }, 00:14:06.343 { 00:14:06.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.343 "dma_device_type": 2 00:14:06.343 } 00:14:06.343 ], 00:14:06.343 "driver_specific": {} 00:14:06.343 } 00:14:06.343 ] 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.343 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.601 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:06.601 "name": "Existed_Raid", 00:14:06.601 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:06.601 "strip_size_kb": 64, 00:14:06.601 "state": "online", 00:14:06.601 "raid_level": "concat", 00:14:06.601 "superblock": true, 00:14:06.601 "num_base_bdevs": 3, 00:14:06.601 "num_base_bdevs_discovered": 3, 00:14:06.601 "num_base_bdevs_operational": 3, 00:14:06.601 "base_bdevs_list": [ 00:14:06.601 { 00:14:06.601 "name": "NewBaseBdev", 00:14:06.601 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:06.601 "is_configured": true, 00:14:06.601 "data_offset": 2048, 00:14:06.601 "data_size": 63488 
00:14:06.601 }, 00:14:06.601 { 00:14:06.601 "name": "BaseBdev2", 00:14:06.601 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:06.601 "is_configured": true, 00:14:06.601 "data_offset": 2048, 00:14:06.601 "data_size": 63488 00:14:06.601 }, 00:14:06.601 { 00:14:06.601 "name": "BaseBdev3", 00:14:06.601 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:06.601 "is_configured": true, 00:14:06.601 "data_offset": 2048, 00:14:06.601 "data_size": 63488 00:14:06.601 } 00:14:06.601 ] 00:14:06.601 }' 00:14:06.601 23:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:06.601 23:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:07.168 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:07.427 [2024-05-14 23:56:07.788464] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:07.427 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:07.427 "name": "Existed_Raid", 00:14:07.427 "aliases": [ 00:14:07.427 "0de908ac-3057-472d-83d8-1923b438e50a" 00:14:07.427 ], 00:14:07.427 "product_name": "Raid Volume", 00:14:07.427 "block_size": 512, 00:14:07.427 "num_blocks": 190464, 00:14:07.427 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:07.427 "assigned_rate_limits": { 00:14:07.427 "rw_ios_per_sec": 0, 00:14:07.427 "rw_mbytes_per_sec": 0, 00:14:07.427 "r_mbytes_per_sec": 0, 00:14:07.427 "w_mbytes_per_sec": 0 00:14:07.427 }, 00:14:07.427 "claimed": false, 00:14:07.427 "zoned": false, 00:14:07.427 "supported_io_types": { 00:14:07.427 "read": true, 00:14:07.427 "write": true, 00:14:07.427 "unmap": true, 00:14:07.427 "write_zeroes": true, 00:14:07.427 "flush": true, 00:14:07.427 "reset": true, 00:14:07.427 "compare": false, 00:14:07.427 "compare_and_write": false, 00:14:07.427 "abort": false, 00:14:07.427 "nvme_admin": false, 00:14:07.427 "nvme_io": false 00:14:07.427 }, 00:14:07.427 "memory_domains": [ 00:14:07.427 { 00:14:07.427 "dma_device_id": "system", 00:14:07.427 "dma_device_type": 1 00:14:07.427 }, 00:14:07.427 { 00:14:07.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.427 "dma_device_type": 2 00:14:07.427 }, 00:14:07.427 { 00:14:07.427 "dma_device_id": "system", 00:14:07.427 "dma_device_type": 1 00:14:07.427 }, 00:14:07.427 { 00:14:07.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.427 "dma_device_type": 2 00:14:07.427 }, 00:14:07.427 { 00:14:07.427 "dma_device_id": "system", 00:14:07.427 "dma_device_type": 1 00:14:07.427 }, 00:14:07.427 { 00:14:07.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:14:07.427 "dma_device_type": 2 00:14:07.427 } 00:14:07.427 ], 00:14:07.427 "driver_specific": { 00:14:07.427 "raid": { 00:14:07.427 "uuid": "0de908ac-3057-472d-83d8-1923b438e50a", 00:14:07.427 "strip_size_kb": 64, 00:14:07.427 "state": "online", 00:14:07.428 "raid_level": "concat", 00:14:07.428 "superblock": true, 00:14:07.428 "num_base_bdevs": 3, 00:14:07.428 "num_base_bdevs_discovered": 3, 00:14:07.428 "num_base_bdevs_operational": 3, 00:14:07.428 "base_bdevs_list": [ 00:14:07.428 { 00:14:07.428 "name": "NewBaseBdev", 00:14:07.428 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:07.428 "is_configured": true, 00:14:07.428 "data_offset": 2048, 00:14:07.428 "data_size": 63488 00:14:07.428 }, 00:14:07.428 { 00:14:07.428 "name": "BaseBdev2", 00:14:07.428 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:07.428 "is_configured": true, 00:14:07.428 "data_offset": 2048, 00:14:07.428 "data_size": 63488 00:14:07.428 }, 00:14:07.428 { 00:14:07.428 "name": "BaseBdev3", 00:14:07.428 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:07.428 "is_configured": true, 00:14:07.428 "data_offset": 2048, 00:14:07.428 "data_size": 63488 00:14:07.428 } 00:14:07.428 ] 00:14:07.428 } 00:14:07.428 } 00:14:07.428 }' 00:14:07.428 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:07.428 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:07.428 BaseBdev2 00:14:07.428 BaseBdev3' 00:14:07.428 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:07.428 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:07.428 23:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:07.687 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:07.687 "name": "NewBaseBdev", 00:14:07.687 "aliases": [ 00:14:07.687 "51a80459-a696-46a9-9135-4e02a26bd710" 00:14:07.687 ], 00:14:07.687 "product_name": "Malloc disk", 00:14:07.687 "block_size": 512, 00:14:07.687 "num_blocks": 65536, 00:14:07.687 "uuid": "51a80459-a696-46a9-9135-4e02a26bd710", 00:14:07.687 "assigned_rate_limits": { 00:14:07.687 "rw_ios_per_sec": 0, 00:14:07.687 "rw_mbytes_per_sec": 0, 00:14:07.687 "r_mbytes_per_sec": 0, 00:14:07.687 "w_mbytes_per_sec": 0 00:14:07.687 }, 00:14:07.687 "claimed": true, 00:14:07.687 "claim_type": "exclusive_write", 00:14:07.687 "zoned": false, 00:14:07.687 "supported_io_types": { 00:14:07.687 "read": true, 00:14:07.687 "write": true, 00:14:07.687 "unmap": true, 00:14:07.687 "write_zeroes": true, 00:14:07.687 "flush": true, 00:14:07.687 "reset": true, 00:14:07.687 "compare": false, 00:14:07.687 "compare_and_write": false, 00:14:07.687 "abort": true, 00:14:07.687 "nvme_admin": false, 00:14:07.687 "nvme_io": false 00:14:07.687 }, 00:14:07.687 "memory_domains": [ 00:14:07.687 { 00:14:07.687 "dma_device_id": "system", 00:14:07.687 "dma_device_type": 1 00:14:07.687 }, 00:14:07.687 { 00:14:07.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.687 "dma_device_type": 2 00:14:07.687 } 00:14:07.687 ], 00:14:07.687 "driver_specific": {} 00:14:07.687 }' 00:14:07.687 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:07.687 23:56:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:07.687 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:07.687 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:07.687 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:07.945 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:08.204 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:08.204 "name": "BaseBdev2", 00:14:08.204 "aliases": [ 00:14:08.204 "bba6d72e-23c2-4c3a-ab16-46ecc94276ef" 00:14:08.204 ], 00:14:08.204 "product_name": "Malloc disk", 00:14:08.204 "block_size": 512, 00:14:08.204 "num_blocks": 65536, 00:14:08.204 "uuid": "bba6d72e-23c2-4c3a-ab16-46ecc94276ef", 00:14:08.204 "assigned_rate_limits": { 00:14:08.204 "rw_ios_per_sec": 0, 00:14:08.204 "rw_mbytes_per_sec": 0, 00:14:08.204 "r_mbytes_per_sec": 0, 00:14:08.204 "w_mbytes_per_sec": 0 00:14:08.204 }, 00:14:08.204 "claimed": true, 00:14:08.204 "claim_type": "exclusive_write", 00:14:08.204 "zoned": false, 00:14:08.204 "supported_io_types": { 00:14:08.204 "read": true, 00:14:08.204 "write": true, 00:14:08.204 "unmap": true, 00:14:08.204 "write_zeroes": true, 00:14:08.204 "flush": true, 00:14:08.204 "reset": true, 00:14:08.204 "compare": false, 00:14:08.204 "compare_and_write": false, 00:14:08.204 "abort": true, 00:14:08.204 "nvme_admin": false, 00:14:08.204 "nvme_io": false 00:14:08.204 }, 00:14:08.204 "memory_domains": [ 00:14:08.204 { 00:14:08.204 "dma_device_id": "system", 00:14:08.204 "dma_device_type": 1 00:14:08.204 }, 00:14:08.204 { 00:14:08.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.204 "dma_device_type": 2 00:14:08.204 } 00:14:08.204 ], 00:14:08.204 "driver_specific": {} 00:14:08.204 }' 00:14:08.204 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:08.204 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:08.204 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:08.204 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:08.462 23:56:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:08.462 23:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:08.462 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:08.462 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:08.462 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:08.462 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:08.720 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:08.720 "name": "BaseBdev3", 00:14:08.720 "aliases": [ 00:14:08.720 "3846529a-aa0d-4be7-a1e2-59d4435ba32c" 00:14:08.720 ], 00:14:08.720 "product_name": "Malloc disk", 00:14:08.720 "block_size": 512, 00:14:08.720 "num_blocks": 65536, 00:14:08.720 "uuid": "3846529a-aa0d-4be7-a1e2-59d4435ba32c", 00:14:08.720 "assigned_rate_limits": { 00:14:08.720 "rw_ios_per_sec": 0, 00:14:08.720 "rw_mbytes_per_sec": 0, 00:14:08.720 "r_mbytes_per_sec": 0, 00:14:08.720 "w_mbytes_per_sec": 0 00:14:08.720 }, 00:14:08.720 "claimed": true, 00:14:08.720 "claim_type": "exclusive_write", 00:14:08.720 "zoned": false, 00:14:08.720 "supported_io_types": { 00:14:08.720 "read": true, 00:14:08.720 "write": true, 00:14:08.720 "unmap": true, 00:14:08.720 "write_zeroes": true, 00:14:08.720 "flush": true, 00:14:08.720 "reset": true, 00:14:08.720 "compare": false, 00:14:08.720 "compare_and_write": false, 00:14:08.720 "abort": true, 00:14:08.720 "nvme_admin": false, 00:14:08.720 "nvme_io": false 00:14:08.720 }, 00:14:08.720 "memory_domains": [ 00:14:08.720 { 00:14:08.720 "dma_device_id": "system", 00:14:08.720 "dma_device_type": 1 00:14:08.720 }, 00:14:08.720 { 00:14:08.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.720 "dma_device_type": 2 00:14:08.720 } 00:14:08.720 ], 00:14:08.720 "driver_specific": {} 00:14:08.720 }' 00:14:08.720 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:08.978 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.978 23:56:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:09.236 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:09.236 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:09.236 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:09.494 [2024-05-14 23:56:09.845664] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:09.494 [2024-05-14 23:56:09.845690] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:09.494 [2024-05-14 23:56:09.845750] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:09.494 [2024-05-14 23:56:09.845803] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:09.494 [2024-05-14 23:56:09.845815] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x134c650 name Existed_Raid, state offline 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 415448 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 415448 ']' 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 415448 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 415448 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 415448' 00:14:09.494 killing process with pid 415448 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 415448 00:14:09.494 [2024-05-14 23:56:09.925136] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:09.494 23:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 415448 00:14:09.494 [2024-05-14 23:56:09.953982] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:09.752 23:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:14:09.752 00:14:09.752 real 0m28.234s 00:14:09.752 user 0m51.752s 00:14:09.752 sys 0m5.080s 00:14:09.752 23:56:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:09.753 23:56:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.753 ************************************ 00:14:09.753 END TEST raid_state_function_test_sb 00:14:09.753 ************************************ 00:14:09.753 23:56:10 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:09.753 23:56:10 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:14:09.753 23:56:10 bdev_raid -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:14:09.753 23:56:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:09.753 ************************************ 00:14:09.753 START TEST raid_superblock_test 00:14:09.753 ************************************ 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 3 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=419690 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 419690 /var/tmp/spdk-raid.sock 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 419690 ']' 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:09.753 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:09.753 23:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.011 [2024-05-14 23:56:10.359108] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
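With the raid_superblock_test bdev_svc target coming up on /var/tmp/spdk-raid.sock, everything that follows in the trace is driven through scripts/rpc.py. As a reading aid, here is a condensed sketch of that RPC sequence (commands, sizes, and UUIDs are taken from this run; the loop and the rpc variable are illustrative shorthand, not part of the actual bdev_raid.sh test script):

    #!/usr/bin/env bash
    # Sketch of the RPC calls exercised by raid_superblock_test below; assumes a
    # bdev_svc app is already listening on /var/tmp/spdk-raid.sock.
    set -e
    rpc=(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock)

    # Three 32 MiB malloc bdevs (65536 x 512-byte blocks, matching the bdev dumps
    # earlier in the log), each wrapped in a passthru bdev with a fixed UUID so the
    # RAID superblock records stable base-bdev identifiers.
    for i in 1 2 3; do
        "${rpc[@]}" bdev_malloc_create 32 512 -b "malloc$i"
        "${rpc[@]}" bdev_passthru_create -b "malloc$i" -p "pt$i" \
            -u "00000000-0000-0000-0000-00000000000$i"
    done

    # Assemble a concat array with a 64 KB strip size and an on-disk superblock (-s).
    "${rpc[@]}" bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s

    # Confirm the array came online with all three base bdevs discovered.
    "${rpc[@]}" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'

The state-function test traced above exercises the same RPC surface, additionally using bdev_raid_remove_base_bdev and bdev_raid_add_base_bdev to toggle base bdevs in and out while verifying that Existed_Raid reports "configuring" until all three members are present and "online" afterwards.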
00:14:10.011 [2024-05-14 23:56:10.359171] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid419690 ] 00:14:10.011 [2024-05-14 23:56:10.487503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:10.011 [2024-05-14 23:56:10.593127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:10.269 [2024-05-14 23:56:10.653527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:10.269 [2024-05-14 23:56:10.653559] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:10.835 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:11.093 malloc1 00:14:11.093 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:11.351 [2024-05-14 23:56:11.776979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:11.351 [2024-05-14 23:56:11.777029] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.351 [2024-05-14 23:56:11.777055] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8d780 00:14:11.351 [2024-05-14 23:56:11.777067] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.351 [2024-05-14 23:56:11.778860] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.351 [2024-05-14 23:56:11.778889] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:11.351 pt1 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:11.351 23:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:11.609 malloc2 00:14:11.609 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:11.867 [2024-05-14 23:56:12.267074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:11.867 [2024-05-14 23:56:12.267121] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.867 [2024-05-14 23:56:12.267140] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8eb60 00:14:11.867 [2024-05-14 23:56:12.267153] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.867 [2024-05-14 23:56:12.268694] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.867 [2024-05-14 23:56:12.268722] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:11.867 pt2 00:14:11.867 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:11.867 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:11.867 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:11.868 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:12.126 malloc3 00:14:12.126 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:12.384 [2024-05-14 23:56:12.765445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:12.384 [2024-05-14 23:56:12.765492] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:12.384 [2024-05-14 23:56:12.765514] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1139080 00:14:12.384 [2024-05-14 23:56:12.765526] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:12.384 [2024-05-14 23:56:12.767117] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:12.384 [2024-05-14 23:56:12.767145] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:12.384 pt3 00:14:12.384 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:12.384 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:12.384 23:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:12.695 [2024-05-14 23:56:13.010112] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:12.695 [2024-05-14 23:56:13.011675] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:12.695 [2024-05-14 23:56:13.011732] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:12.695 [2024-05-14 23:56:13.011889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x113c910 00:14:12.695 [2024-05-14 23:56:13.011901] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:12.695 [2024-05-14 23:56:13.012106] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113be30 00:14:12.695 [2024-05-14 23:56:13.012256] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x113c910 00:14:12.695 [2024-05-14 23:56:13.012266] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x113c910 00:14:12.695 [2024-05-14 23:56:13.012370] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:12.695 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:12.696 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.696 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:12.953 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:12.953 "name": "raid_bdev1", 00:14:12.953 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:12.953 "strip_size_kb": 64, 00:14:12.953 "state": "online", 00:14:12.954 "raid_level": "concat", 00:14:12.954 "superblock": true, 00:14:12.954 "num_base_bdevs": 3, 
00:14:12.954 "num_base_bdevs_discovered": 3, 00:14:12.954 "num_base_bdevs_operational": 3, 00:14:12.954 "base_bdevs_list": [ 00:14:12.954 { 00:14:12.954 "name": "pt1", 00:14:12.954 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:12.954 "is_configured": true, 00:14:12.954 "data_offset": 2048, 00:14:12.954 "data_size": 63488 00:14:12.954 }, 00:14:12.954 { 00:14:12.954 "name": "pt2", 00:14:12.954 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:12.954 "is_configured": true, 00:14:12.954 "data_offset": 2048, 00:14:12.954 "data_size": 63488 00:14:12.954 }, 00:14:12.954 { 00:14:12.954 "name": "pt3", 00:14:12.954 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:12.954 "is_configured": true, 00:14:12.954 "data_offset": 2048, 00:14:12.954 "data_size": 63488 00:14:12.954 } 00:14:12.954 ] 00:14:12.954 }' 00:14:12.954 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:12.954 23:56:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:13.519 23:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:13.519 [2024-05-14 23:56:14.069134] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.519 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:13.519 "name": "raid_bdev1", 00:14:13.519 "aliases": [ 00:14:13.519 "14f60316-2ea2-4075-86fd-432f3ac40211" 00:14:13.519 ], 00:14:13.519 "product_name": "Raid Volume", 00:14:13.519 "block_size": 512, 00:14:13.519 "num_blocks": 190464, 00:14:13.519 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:13.519 "assigned_rate_limits": { 00:14:13.519 "rw_ios_per_sec": 0, 00:14:13.519 "rw_mbytes_per_sec": 0, 00:14:13.519 "r_mbytes_per_sec": 0, 00:14:13.519 "w_mbytes_per_sec": 0 00:14:13.519 }, 00:14:13.519 "claimed": false, 00:14:13.519 "zoned": false, 00:14:13.519 "supported_io_types": { 00:14:13.519 "read": true, 00:14:13.519 "write": true, 00:14:13.519 "unmap": true, 00:14:13.519 "write_zeroes": true, 00:14:13.519 "flush": true, 00:14:13.519 "reset": true, 00:14:13.519 "compare": false, 00:14:13.519 "compare_and_write": false, 00:14:13.519 "abort": false, 00:14:13.519 "nvme_admin": false, 00:14:13.519 "nvme_io": false 00:14:13.519 }, 00:14:13.519 "memory_domains": [ 00:14:13.519 { 00:14:13.519 "dma_device_id": "system", 00:14:13.519 "dma_device_type": 1 00:14:13.519 }, 00:14:13.519 { 00:14:13.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.519 "dma_device_type": 2 00:14:13.519 }, 00:14:13.519 { 00:14:13.519 "dma_device_id": "system", 00:14:13.519 "dma_device_type": 1 00:14:13.519 }, 00:14:13.519 { 00:14:13.519 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:13.519 "dma_device_type": 2 00:14:13.519 }, 00:14:13.519 { 00:14:13.519 "dma_device_id": "system", 00:14:13.519 "dma_device_type": 1 00:14:13.519 }, 00:14:13.519 { 00:14:13.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.520 "dma_device_type": 2 00:14:13.520 } 00:14:13.520 ], 00:14:13.520 "driver_specific": { 00:14:13.520 "raid": { 00:14:13.520 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:13.520 "strip_size_kb": 64, 00:14:13.520 "state": "online", 00:14:13.520 "raid_level": "concat", 00:14:13.520 "superblock": true, 00:14:13.520 "num_base_bdevs": 3, 00:14:13.520 "num_base_bdevs_discovered": 3, 00:14:13.520 "num_base_bdevs_operational": 3, 00:14:13.520 "base_bdevs_list": [ 00:14:13.520 { 00:14:13.520 "name": "pt1", 00:14:13.520 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:13.520 "is_configured": true, 00:14:13.520 "data_offset": 2048, 00:14:13.520 "data_size": 63488 00:14:13.520 }, 00:14:13.520 { 00:14:13.520 "name": "pt2", 00:14:13.520 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:13.520 "is_configured": true, 00:14:13.520 "data_offset": 2048, 00:14:13.520 "data_size": 63488 00:14:13.520 }, 00:14:13.520 { 00:14:13.520 "name": "pt3", 00:14:13.520 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:13.520 "is_configured": true, 00:14:13.520 "data_offset": 2048, 00:14:13.520 "data_size": 63488 00:14:13.520 } 00:14:13.520 ] 00:14:13.520 } 00:14:13.520 } 00:14:13.520 }' 00:14:13.520 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:13.778 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:14:13.778 pt2 00:14:13.778 pt3' 00:14:13.778 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:13.778 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:13.778 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:14.035 "name": "pt1", 00:14:14.035 "aliases": [ 00:14:14.035 "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee" 00:14:14.035 ], 00:14:14.035 "product_name": "passthru", 00:14:14.035 "block_size": 512, 00:14:14.035 "num_blocks": 65536, 00:14:14.035 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:14.035 "assigned_rate_limits": { 00:14:14.035 "rw_ios_per_sec": 0, 00:14:14.035 "rw_mbytes_per_sec": 0, 00:14:14.035 "r_mbytes_per_sec": 0, 00:14:14.035 "w_mbytes_per_sec": 0 00:14:14.035 }, 00:14:14.035 "claimed": true, 00:14:14.035 "claim_type": "exclusive_write", 00:14:14.035 "zoned": false, 00:14:14.035 "supported_io_types": { 00:14:14.035 "read": true, 00:14:14.035 "write": true, 00:14:14.035 "unmap": true, 00:14:14.035 "write_zeroes": true, 00:14:14.035 "flush": true, 00:14:14.035 "reset": true, 00:14:14.035 "compare": false, 00:14:14.035 "compare_and_write": false, 00:14:14.035 "abort": true, 00:14:14.035 "nvme_admin": false, 00:14:14.035 "nvme_io": false 00:14:14.035 }, 00:14:14.035 "memory_domains": [ 00:14:14.035 { 00:14:14.035 "dma_device_id": "system", 00:14:14.035 "dma_device_type": 1 00:14:14.035 }, 00:14:14.035 { 00:14:14.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.035 "dma_device_type": 2 00:14:14.035 } 00:14:14.035 ], 00:14:14.035 "driver_specific": { 
00:14:14.035 "passthru": { 00:14:14.035 "name": "pt1", 00:14:14.035 "base_bdev_name": "malloc1" 00:14:14.035 } 00:14:14.035 } 00:14:14.035 }' 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:14.035 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:14.294 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:14.551 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:14.551 "name": "pt2", 00:14:14.551 "aliases": [ 00:14:14.551 "8922214f-d33c-5dd7-b30c-552494eef8bb" 00:14:14.551 ], 00:14:14.551 "product_name": "passthru", 00:14:14.551 "block_size": 512, 00:14:14.551 "num_blocks": 65536, 00:14:14.551 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:14.551 "assigned_rate_limits": { 00:14:14.551 "rw_ios_per_sec": 0, 00:14:14.551 "rw_mbytes_per_sec": 0, 00:14:14.551 "r_mbytes_per_sec": 0, 00:14:14.551 "w_mbytes_per_sec": 0 00:14:14.551 }, 00:14:14.551 "claimed": true, 00:14:14.551 "claim_type": "exclusive_write", 00:14:14.551 "zoned": false, 00:14:14.551 "supported_io_types": { 00:14:14.551 "read": true, 00:14:14.551 "write": true, 00:14:14.551 "unmap": true, 00:14:14.551 "write_zeroes": true, 00:14:14.551 "flush": true, 00:14:14.551 "reset": true, 00:14:14.551 "compare": false, 00:14:14.551 "compare_and_write": false, 00:14:14.551 "abort": true, 00:14:14.551 "nvme_admin": false, 00:14:14.551 "nvme_io": false 00:14:14.551 }, 00:14:14.551 "memory_domains": [ 00:14:14.551 { 00:14:14.551 "dma_device_id": "system", 00:14:14.551 "dma_device_type": 1 00:14:14.551 }, 00:14:14.551 { 00:14:14.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.551 "dma_device_type": 2 00:14:14.551 } 00:14:14.551 ], 00:14:14.551 "driver_specific": { 00:14:14.551 "passthru": { 00:14:14.551 "name": "pt2", 00:14:14.551 "base_bdev_name": "malloc2" 00:14:14.551 } 00:14:14.551 } 00:14:14.551 }' 00:14:14.551 23:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 
]] 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.551 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:14.809 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:15.067 "name": "pt3", 00:14:15.067 "aliases": [ 00:14:15.067 "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5" 00:14:15.067 ], 00:14:15.067 "product_name": "passthru", 00:14:15.067 "block_size": 512, 00:14:15.067 "num_blocks": 65536, 00:14:15.067 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:15.067 "assigned_rate_limits": { 00:14:15.067 "rw_ios_per_sec": 0, 00:14:15.067 "rw_mbytes_per_sec": 0, 00:14:15.067 "r_mbytes_per_sec": 0, 00:14:15.067 "w_mbytes_per_sec": 0 00:14:15.067 }, 00:14:15.067 "claimed": true, 00:14:15.067 "claim_type": "exclusive_write", 00:14:15.067 "zoned": false, 00:14:15.067 "supported_io_types": { 00:14:15.067 "read": true, 00:14:15.067 "write": true, 00:14:15.067 "unmap": true, 00:14:15.067 "write_zeroes": true, 00:14:15.067 "flush": true, 00:14:15.067 "reset": true, 00:14:15.067 "compare": false, 00:14:15.067 "compare_and_write": false, 00:14:15.067 "abort": true, 00:14:15.067 "nvme_admin": false, 00:14:15.067 "nvme_io": false 00:14:15.067 }, 00:14:15.067 "memory_domains": [ 00:14:15.067 { 00:14:15.067 "dma_device_id": "system", 00:14:15.067 "dma_device_type": 1 00:14:15.067 }, 00:14:15.067 { 00:14:15.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.067 "dma_device_type": 2 00:14:15.067 } 00:14:15.067 ], 00:14:15.067 "driver_specific": { 00:14:15.067 "passthru": { 00:14:15.067 "name": "pt3", 00:14:15.067 "base_bdev_name": "malloc3" 00:14:15.067 } 00:14:15.067 } 00:14:15.067 }' 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:15.067 23:56:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:15.067 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.324 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:15.324 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:15.324 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:15.324 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:15.324 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:14:15.324 [2024-05-14 23:56:15.914016] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:15.582 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=14f60316-2ea2-4075-86fd-432f3ac40211 00:14:15.582 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 14f60316-2ea2-4075-86fd-432f3ac40211 ']' 00:14:15.582 23:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:15.582 [2024-05-14 23:56:16.162428] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:15.582 [2024-05-14 23:56:16.162456] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:15.582 [2024-05-14 23:56:16.162509] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:15.582 [2024-05-14 23:56:16.162563] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:15.582 [2024-05-14 23:56:16.162574] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x113c910 name raid_bdev1, state offline 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:15.840 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:16.099 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:16.099 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:16.357 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:16.357 23:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:16.616 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:16.616 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:16.873 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:17.131 [2024-05-14 23:56:17.602185] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:17.131 [2024-05-14 23:56:17.603584] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:17.131 [2024-05-14 23:56:17.603627] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:17.131 [2024-05-14 23:56:17.603676] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:17.131 [2024-05-14 23:56:17.603716] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:17.131 [2024-05-14 23:56:17.603739] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:17.131 [2024-05-14 23:56:17.603756] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:17.131 [2024-05-14 23:56:17.603766] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1139480 name raid_bdev1, state configuring 
00:14:17.131 request: 00:14:17.131 { 00:14:17.131 "name": "raid_bdev1", 00:14:17.131 "raid_level": "concat", 00:14:17.131 "base_bdevs": [ 00:14:17.131 "malloc1", 00:14:17.131 "malloc2", 00:14:17.131 "malloc3" 00:14:17.131 ], 00:14:17.131 "superblock": false, 00:14:17.131 "strip_size_kb": 64, 00:14:17.131 "method": "bdev_raid_create", 00:14:17.131 "req_id": 1 00:14:17.131 } 00:14:17.131 Got JSON-RPC error response 00:14:17.131 response: 00:14:17.131 { 00:14:17.131 "code": -17, 00:14:17.131 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:17.131 } 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.131 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:14:17.390 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:14:17.390 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:14:17.390 23:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:17.648 [2024-05-14 23:56:18.091407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:17.648 [2024-05-14 23:56:18.091451] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.648 [2024-05-14 23:56:18.091473] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1136aa0 00:14:17.648 [2024-05-14 23:56:18.091486] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.648 [2024-05-14 23:56:18.093137] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.648 [2024-05-14 23:56:18.093165] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:17.648 [2024-05-14 23:56:18.093233] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:14:17.648 [2024-05-14 23:56:18.093260] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:17.648 pt1 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:17.648 23:56:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.648 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.907 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:17.907 "name": "raid_bdev1", 00:14:17.907 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:17.907 "strip_size_kb": 64, 00:14:17.907 "state": "configuring", 00:14:17.907 "raid_level": "concat", 00:14:17.907 "superblock": true, 00:14:17.907 "num_base_bdevs": 3, 00:14:17.907 "num_base_bdevs_discovered": 1, 00:14:17.907 "num_base_bdevs_operational": 3, 00:14:17.907 "base_bdevs_list": [ 00:14:17.907 { 00:14:17.907 "name": "pt1", 00:14:17.907 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:17.907 "is_configured": true, 00:14:17.907 "data_offset": 2048, 00:14:17.907 "data_size": 63488 00:14:17.907 }, 00:14:17.907 { 00:14:17.907 "name": null, 00:14:17.907 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:17.907 "is_configured": false, 00:14:17.907 "data_offset": 2048, 00:14:17.907 "data_size": 63488 00:14:17.907 }, 00:14:17.907 { 00:14:17.907 "name": null, 00:14:17.907 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:17.907 "is_configured": false, 00:14:17.907 "data_offset": 2048, 00:14:17.907 "data_size": 63488 00:14:17.907 } 00:14:17.907 ] 00:14:17.907 }' 00:14:17.907 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:17.907 23:56:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.474 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:14:18.474 23:56:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:18.732 [2024-05-14 23:56:19.138186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:18.732 [2024-05-14 23:56:19.138241] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.732 [2024-05-14 23:56:19.138265] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x113cdd0 00:14:18.732 [2024-05-14 23:56:19.138278] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.732 [2024-05-14 23:56:19.138624] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.732 [2024-05-14 23:56:19.138642] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:18.732 [2024-05-14 23:56:19.138707] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:14:18.732 [2024-05-14 23:56:19.138725] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:18.732 pt2 00:14:18.732 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:18.990 
[2024-05-14 23:56:19.378839] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.990 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.249 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:19.249 "name": "raid_bdev1", 00:14:19.249 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:19.249 "strip_size_kb": 64, 00:14:19.249 "state": "configuring", 00:14:19.249 "raid_level": "concat", 00:14:19.249 "superblock": true, 00:14:19.249 "num_base_bdevs": 3, 00:14:19.249 "num_base_bdevs_discovered": 1, 00:14:19.249 "num_base_bdevs_operational": 3, 00:14:19.249 "base_bdevs_list": [ 00:14:19.249 { 00:14:19.249 "name": "pt1", 00:14:19.249 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:19.249 "is_configured": true, 00:14:19.249 "data_offset": 2048, 00:14:19.249 "data_size": 63488 00:14:19.249 }, 00:14:19.249 { 00:14:19.249 "name": null, 00:14:19.249 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:19.249 "is_configured": false, 00:14:19.249 "data_offset": 2048, 00:14:19.249 "data_size": 63488 00:14:19.249 }, 00:14:19.249 { 00:14:19.249 "name": null, 00:14:19.249 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:19.249 "is_configured": false, 00:14:19.249 "data_offset": 2048, 00:14:19.249 "data_size": 63488 00:14:19.249 } 00:14:19.249 ] 00:14:19.249 }' 00:14:19.249 23:56:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:19.249 23:56:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.816 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:14:19.816 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:19.816 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:20.074 [2024-05-14 23:56:20.465717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:20.074 [2024-05-14 23:56:20.465772] vbdev_passthru.c: 
636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.074 [2024-05-14 23:56:20.465794] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8e370 00:14:20.074 [2024-05-14 23:56:20.465807] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.074 [2024-05-14 23:56:20.466143] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.074 [2024-05-14 23:56:20.466159] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:20.074 [2024-05-14 23:56:20.466224] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:14:20.074 [2024-05-14 23:56:20.466243] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:20.074 pt2 00:14:20.074 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:14:20.074 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:20.074 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:20.333 [2024-05-14 23:56:20.706357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:20.333 [2024-05-14 23:56:20.706411] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.333 [2024-05-14 23:56:20.706437] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1138470 00:14:20.333 [2024-05-14 23:56:20.706449] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.333 [2024-05-14 23:56:20.706772] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.333 [2024-05-14 23:56:20.706788] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:20.333 [2024-05-14 23:56:20.706851] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:14:20.333 [2024-05-14 23:56:20.706871] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:20.333 [2024-05-14 23:56:20.706978] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1137ce0 00:14:20.333 [2024-05-14 23:56:20.706988] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:20.333 [2024-05-14 23:56:20.707154] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113ea90 00:14:20.333 [2024-05-14 23:56:20.707289] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1137ce0 00:14:20.333 [2024-05-14 23:56:20.707299] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1137ce0 00:14:20.333 [2024-05-14 23:56:20.707407] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:20.333 pt3 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.333 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:20.592 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:20.592 "name": "raid_bdev1", 00:14:20.592 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:20.592 "strip_size_kb": 64, 00:14:20.592 "state": "online", 00:14:20.592 "raid_level": "concat", 00:14:20.592 "superblock": true, 00:14:20.592 "num_base_bdevs": 3, 00:14:20.592 "num_base_bdevs_discovered": 3, 00:14:20.592 "num_base_bdevs_operational": 3, 00:14:20.592 "base_bdevs_list": [ 00:14:20.592 { 00:14:20.592 "name": "pt1", 00:14:20.592 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:20.592 "is_configured": true, 00:14:20.592 "data_offset": 2048, 00:14:20.592 "data_size": 63488 00:14:20.592 }, 00:14:20.592 { 00:14:20.592 "name": "pt2", 00:14:20.592 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:20.592 "is_configured": true, 00:14:20.592 "data_offset": 2048, 00:14:20.592 "data_size": 63488 00:14:20.592 }, 00:14:20.592 { 00:14:20.592 "name": "pt3", 00:14:20.592 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:20.592 "is_configured": true, 00:14:20.592 "data_offset": 2048, 00:14:20.592 "data_size": 63488 00:14:20.592 } 00:14:20.592 ] 00:14:20.592 }' 00:14:20.592 23:56:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:20.592 23:56:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:21.158 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:21.416 [2024-05-14 23:56:21.773451] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:21.416 23:56:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:21.416 "name": "raid_bdev1", 00:14:21.416 "aliases": [ 00:14:21.416 "14f60316-2ea2-4075-86fd-432f3ac40211" 00:14:21.416 ], 00:14:21.416 "product_name": "Raid Volume", 00:14:21.416 "block_size": 512, 00:14:21.416 "num_blocks": 190464, 00:14:21.416 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:21.416 "assigned_rate_limits": { 00:14:21.416 "rw_ios_per_sec": 0, 00:14:21.416 "rw_mbytes_per_sec": 0, 00:14:21.416 "r_mbytes_per_sec": 0, 00:14:21.416 "w_mbytes_per_sec": 0 00:14:21.416 }, 00:14:21.416 "claimed": false, 00:14:21.416 "zoned": false, 00:14:21.416 "supported_io_types": { 00:14:21.416 "read": true, 00:14:21.416 "write": true, 00:14:21.416 "unmap": true, 00:14:21.416 "write_zeroes": true, 00:14:21.416 "flush": true, 00:14:21.416 "reset": true, 00:14:21.416 "compare": false, 00:14:21.416 "compare_and_write": false, 00:14:21.416 "abort": false, 00:14:21.416 "nvme_admin": false, 00:14:21.416 "nvme_io": false 00:14:21.416 }, 00:14:21.416 "memory_domains": [ 00:14:21.416 { 00:14:21.416 "dma_device_id": "system", 00:14:21.416 "dma_device_type": 1 00:14:21.416 }, 00:14:21.416 { 00:14:21.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.416 "dma_device_type": 2 00:14:21.416 }, 00:14:21.416 { 00:14:21.416 "dma_device_id": "system", 00:14:21.416 "dma_device_type": 1 00:14:21.416 }, 00:14:21.416 { 00:14:21.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.416 "dma_device_type": 2 00:14:21.416 }, 00:14:21.416 { 00:14:21.416 "dma_device_id": "system", 00:14:21.416 "dma_device_type": 1 00:14:21.416 }, 00:14:21.416 { 00:14:21.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.416 "dma_device_type": 2 00:14:21.416 } 00:14:21.416 ], 00:14:21.416 "driver_specific": { 00:14:21.416 "raid": { 00:14:21.416 "uuid": "14f60316-2ea2-4075-86fd-432f3ac40211", 00:14:21.416 "strip_size_kb": 64, 00:14:21.416 "state": "online", 00:14:21.416 "raid_level": "concat", 00:14:21.416 "superblock": true, 00:14:21.416 "num_base_bdevs": 3, 00:14:21.416 "num_base_bdevs_discovered": 3, 00:14:21.417 "num_base_bdevs_operational": 3, 00:14:21.417 "base_bdevs_list": [ 00:14:21.417 { 00:14:21.417 "name": "pt1", 00:14:21.417 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:21.417 "is_configured": true, 00:14:21.417 "data_offset": 2048, 00:14:21.417 "data_size": 63488 00:14:21.417 }, 00:14:21.417 { 00:14:21.417 "name": "pt2", 00:14:21.417 "uuid": "8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:21.417 "is_configured": true, 00:14:21.417 "data_offset": 2048, 00:14:21.417 "data_size": 63488 00:14:21.417 }, 00:14:21.417 { 00:14:21.417 "name": "pt3", 00:14:21.417 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:21.417 "is_configured": true, 00:14:21.417 "data_offset": 2048, 00:14:21.417 "data_size": 63488 00:14:21.417 } 00:14:21.417 ] 00:14:21.417 } 00:14:21.417 } 00:14:21.417 }' 00:14:21.417 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:21.417 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:14:21.417 pt2 00:14:21.417 pt3' 00:14:21.417 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:21.417 23:56:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:21.417 23:56:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:21.674 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:21.674 "name": "pt1", 00:14:21.674 "aliases": [ 00:14:21.674 "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee" 00:14:21.674 ], 00:14:21.674 "product_name": "passthru", 00:14:21.674 "block_size": 512, 00:14:21.674 "num_blocks": 65536, 00:14:21.674 "uuid": "cdfc2b37-de4e-5142-8c3e-e75a9acf18ee", 00:14:21.674 "assigned_rate_limits": { 00:14:21.674 "rw_ios_per_sec": 0, 00:14:21.674 "rw_mbytes_per_sec": 0, 00:14:21.674 "r_mbytes_per_sec": 0, 00:14:21.674 "w_mbytes_per_sec": 0 00:14:21.674 }, 00:14:21.675 "claimed": true, 00:14:21.675 "claim_type": "exclusive_write", 00:14:21.675 "zoned": false, 00:14:21.675 "supported_io_types": { 00:14:21.675 "read": true, 00:14:21.675 "write": true, 00:14:21.675 "unmap": true, 00:14:21.675 "write_zeroes": true, 00:14:21.675 "flush": true, 00:14:21.675 "reset": true, 00:14:21.675 "compare": false, 00:14:21.675 "compare_and_write": false, 00:14:21.675 "abort": true, 00:14:21.675 "nvme_admin": false, 00:14:21.675 "nvme_io": false 00:14:21.675 }, 00:14:21.675 "memory_domains": [ 00:14:21.675 { 00:14:21.675 "dma_device_id": "system", 00:14:21.675 "dma_device_type": 1 00:14:21.675 }, 00:14:21.675 { 00:14:21.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.675 "dma_device_type": 2 00:14:21.675 } 00:14:21.675 ], 00:14:21.675 "driver_specific": { 00:14:21.675 "passthru": { 00:14:21.675 "name": "pt1", 00:14:21.675 "base_bdev_name": "malloc1" 00:14:21.675 } 00:14:21.675 } 00:14:21.675 }' 00:14:21.675 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:21.675 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:21.675 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:21.675 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:21.675 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:21.931 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:22.189 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:22.189 "name": "pt2", 00:14:22.189 "aliases": [ 00:14:22.189 "8922214f-d33c-5dd7-b30c-552494eef8bb" 00:14:22.189 ], 00:14:22.189 "product_name": "passthru", 00:14:22.189 "block_size": 512, 00:14:22.189 "num_blocks": 65536, 00:14:22.189 "uuid": 
"8922214f-d33c-5dd7-b30c-552494eef8bb", 00:14:22.189 "assigned_rate_limits": { 00:14:22.189 "rw_ios_per_sec": 0, 00:14:22.189 "rw_mbytes_per_sec": 0, 00:14:22.189 "r_mbytes_per_sec": 0, 00:14:22.189 "w_mbytes_per_sec": 0 00:14:22.189 }, 00:14:22.189 "claimed": true, 00:14:22.189 "claim_type": "exclusive_write", 00:14:22.189 "zoned": false, 00:14:22.189 "supported_io_types": { 00:14:22.189 "read": true, 00:14:22.189 "write": true, 00:14:22.189 "unmap": true, 00:14:22.189 "write_zeroes": true, 00:14:22.189 "flush": true, 00:14:22.189 "reset": true, 00:14:22.189 "compare": false, 00:14:22.189 "compare_and_write": false, 00:14:22.189 "abort": true, 00:14:22.189 "nvme_admin": false, 00:14:22.189 "nvme_io": false 00:14:22.189 }, 00:14:22.189 "memory_domains": [ 00:14:22.189 { 00:14:22.189 "dma_device_id": "system", 00:14:22.189 "dma_device_type": 1 00:14:22.189 }, 00:14:22.189 { 00:14:22.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.189 "dma_device_type": 2 00:14:22.189 } 00:14:22.189 ], 00:14:22.189 "driver_specific": { 00:14:22.189 "passthru": { 00:14:22.189 "name": "pt2", 00:14:22.189 "base_bdev_name": "malloc2" 00:14:22.189 } 00:14:22.189 } 00:14:22.189 }' 00:14:22.189 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.189 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.189 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:22.189 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.446 23:56:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.446 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:22.446 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:22.446 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:22.446 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:22.704 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:22.704 "name": "pt3", 00:14:22.704 "aliases": [ 00:14:22.704 "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5" 00:14:22.704 ], 00:14:22.704 "product_name": "passthru", 00:14:22.704 "block_size": 512, 00:14:22.704 "num_blocks": 65536, 00:14:22.704 "uuid": "5b4684e5-e1fe-503b-baaf-b67b8f04d2e5", 00:14:22.704 "assigned_rate_limits": { 00:14:22.704 "rw_ios_per_sec": 0, 00:14:22.704 "rw_mbytes_per_sec": 0, 00:14:22.704 "r_mbytes_per_sec": 0, 00:14:22.704 "w_mbytes_per_sec": 0 00:14:22.704 }, 00:14:22.704 "claimed": true, 00:14:22.704 "claim_type": "exclusive_write", 00:14:22.704 "zoned": false, 00:14:22.704 "supported_io_types": { 00:14:22.704 "read": true, 00:14:22.704 "write": true, 
00:14:22.704 "unmap": true, 00:14:22.704 "write_zeroes": true, 00:14:22.704 "flush": true, 00:14:22.704 "reset": true, 00:14:22.704 "compare": false, 00:14:22.704 "compare_and_write": false, 00:14:22.704 "abort": true, 00:14:22.704 "nvme_admin": false, 00:14:22.704 "nvme_io": false 00:14:22.704 }, 00:14:22.704 "memory_domains": [ 00:14:22.704 { 00:14:22.704 "dma_device_id": "system", 00:14:22.704 "dma_device_type": 1 00:14:22.704 }, 00:14:22.704 { 00:14:22.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.704 "dma_device_type": 2 00:14:22.704 } 00:14:22.704 ], 00:14:22.704 "driver_specific": { 00:14:22.704 "passthru": { 00:14:22.704 "name": "pt3", 00:14:22.704 "base_bdev_name": "malloc3" 00:14:22.704 } 00:14:22.704 } 00:14:22.704 }' 00:14:22.704 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.962 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:23.222 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:23.222 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:23.223 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:14:23.483 [2024-05-14 23:56:23.818860] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 14f60316-2ea2-4075-86fd-432f3ac40211 '!=' 14f60316-2ea2-4075-86fd-432f3ac40211 ']' 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 419690 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 419690 ']' 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 419690 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 419690 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 419690' 00:14:23.483 killing process with pid 419690 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 419690 00:14:23.483 [2024-05-14 23:56:23.893293] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:23.483 [2024-05-14 23:56:23.893356] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.483 [2024-05-14 23:56:23.893420] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:23.483 [2024-05-14 23:56:23.893433] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1137ce0 name raid_bdev1, state offline 00:14:23.483 23:56:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 419690 00:14:23.483 [2024-05-14 23:56:23.920029] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:23.741 23:56:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:14:23.741 00:14:23.741 real 0m13.850s 00:14:23.741 user 0m24.888s 00:14:23.741 sys 0m2.541s 00:14:23.741 23:56:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:23.741 23:56:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.741 ************************************ 00:14:23.741 END TEST raid_superblock_test 00:14:23.741 ************************************ 00:14:23.741 23:56:24 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:14:23.741 23:56:24 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:14:23.741 23:56:24 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:14:23.741 23:56:24 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:23.741 23:56:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:23.741 ************************************ 00:14:23.741 START TEST raid_state_function_test 00:14:23.741 ************************************ 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 false 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:14:23.741 
23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=421896 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 421896' 00:14:23.741 Process raid pid: 421896 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 421896 /var/tmp/spdk-raid.sock 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 421896 ']' 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:23.741 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:23.741 23:56:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.741 [2024-05-14 23:56:24.301234] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:14:23.741 [2024-05-14 23:56:24.301294] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:23.999 [2024-05-14 23:56:24.430514] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:23.999 [2024-05-14 23:56:24.536147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:24.257 [2024-05-14 23:56:24.595999] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:24.257 [2024-05-14 23:56:24.596026] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:24.822 23:56:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:24.822 23:56:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:14:24.822 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:25.079 [2024-05-14 23:56:25.455411] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:25.079 [2024-05-14 23:56:25.455451] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:25.079 [2024-05-14 23:56:25.455462] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:25.079 [2024-05-14 23:56:25.455474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:25.079 [2024-05-14 23:56:25.455483] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:25.079 [2024-05-14 23:56:25.455494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.079 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.337 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:14:25.337 "name": "Existed_Raid", 00:14:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.337 "strip_size_kb": 0, 00:14:25.337 "state": "configuring", 00:14:25.337 "raid_level": "raid1", 00:14:25.337 "superblock": false, 00:14:25.337 "num_base_bdevs": 3, 00:14:25.337 "num_base_bdevs_discovered": 0, 00:14:25.337 "num_base_bdevs_operational": 3, 00:14:25.337 "base_bdevs_list": [ 00:14:25.337 { 00:14:25.337 "name": "BaseBdev1", 00:14:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.337 "is_configured": false, 00:14:25.337 "data_offset": 0, 00:14:25.337 "data_size": 0 00:14:25.337 }, 00:14:25.337 { 00:14:25.337 "name": "BaseBdev2", 00:14:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.337 "is_configured": false, 00:14:25.337 "data_offset": 0, 00:14:25.337 "data_size": 0 00:14:25.337 }, 00:14:25.337 { 00:14:25.337 "name": "BaseBdev3", 00:14:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.337 "is_configured": false, 00:14:25.337 "data_offset": 0, 00:14:25.337 "data_size": 0 00:14:25.337 } 00:14:25.337 ] 00:14:25.337 }' 00:14:25.337 23:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:25.337 23:56:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.902 23:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:26.159 [2024-05-14 23:56:26.542153] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:26.159 [2024-05-14 23:56:26.542185] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203abe0 name Existed_Raid, state configuring 00:14:26.159 23:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:26.417 [2024-05-14 23:56:26.782815] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:26.417 [2024-05-14 23:56:26.782844] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:26.417 [2024-05-14 23:56:26.782854] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:26.417 [2024-05-14 23:56:26.782865] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:26.417 [2024-05-14 23:56:26.782874] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:26.417 [2024-05-14 23:56:26.782885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:26.417 23:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:26.674 [2024-05-14 23:56:27.037313] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:26.674 BaseBdev1 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:26.674 23:56:27 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:26.674 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:26.932 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:26.932 [ 00:14:26.932 { 00:14:26.932 "name": "BaseBdev1", 00:14:26.932 "aliases": [ 00:14:26.932 "72bca0eb-0df6-400e-8a5b-15e355be09de" 00:14:26.932 ], 00:14:26.932 "product_name": "Malloc disk", 00:14:26.932 "block_size": 512, 00:14:26.932 "num_blocks": 65536, 00:14:26.932 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:26.932 "assigned_rate_limits": { 00:14:26.932 "rw_ios_per_sec": 0, 00:14:26.932 "rw_mbytes_per_sec": 0, 00:14:26.932 "r_mbytes_per_sec": 0, 00:14:26.932 "w_mbytes_per_sec": 0 00:14:26.932 }, 00:14:26.932 "claimed": true, 00:14:26.932 "claim_type": "exclusive_write", 00:14:26.932 "zoned": false, 00:14:26.932 "supported_io_types": { 00:14:26.932 "read": true, 00:14:26.932 "write": true, 00:14:26.932 "unmap": true, 00:14:26.932 "write_zeroes": true, 00:14:26.932 "flush": true, 00:14:26.932 "reset": true, 00:14:26.932 "compare": false, 00:14:26.932 "compare_and_write": false, 00:14:26.932 "abort": true, 00:14:26.932 "nvme_admin": false, 00:14:26.932 "nvme_io": false 00:14:26.932 }, 00:14:26.932 "memory_domains": [ 00:14:26.932 { 00:14:26.932 "dma_device_id": "system", 00:14:26.932 "dma_device_type": 1 00:14:26.932 }, 00:14:26.932 { 00:14:26.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.932 "dma_device_type": 2 00:14:26.932 } 00:14:26.932 ], 00:14:26.932 "driver_specific": {} 00:14:26.932 } 00:14:26.932 ] 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:27.189 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.447 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:27.447 "name": "Existed_Raid", 00:14:27.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.447 "strip_size_kb": 0, 00:14:27.447 "state": "configuring", 00:14:27.447 "raid_level": "raid1", 00:14:27.447 "superblock": false, 00:14:27.447 "num_base_bdevs": 3, 00:14:27.447 "num_base_bdevs_discovered": 1, 00:14:27.447 "num_base_bdevs_operational": 3, 00:14:27.447 "base_bdevs_list": [ 00:14:27.447 { 00:14:27.447 "name": "BaseBdev1", 00:14:27.447 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:27.447 "is_configured": true, 00:14:27.447 "data_offset": 0, 00:14:27.447 "data_size": 65536 00:14:27.447 }, 00:14:27.447 { 00:14:27.447 "name": "BaseBdev2", 00:14:27.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.447 "is_configured": false, 00:14:27.447 "data_offset": 0, 00:14:27.447 "data_size": 0 00:14:27.447 }, 00:14:27.447 { 00:14:27.447 "name": "BaseBdev3", 00:14:27.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.447 "is_configured": false, 00:14:27.447 "data_offset": 0, 00:14:27.447 "data_size": 0 00:14:27.447 } 00:14:27.447 ] 00:14:27.447 }' 00:14:27.447 23:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:27.447 23:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.014 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:28.014 [2024-05-14 23:56:28.593439] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:28.014 [2024-05-14 23:56:28.593477] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203a4b0 name Existed_Raid, state configuring 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:28.272 [2024-05-14 23:56:28.773940] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:28.272 [2024-05-14 23:56:28.775450] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:28.272 [2024-05-14 23:56:28.775483] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:28.272 [2024-05-14 23:56:28.775494] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:28.272 [2024-05-14 23:56:28.775510] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:28.272 23:56:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.272 23:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.533 23:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:28.533 "name": "Existed_Raid", 00:14:28.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.533 "strip_size_kb": 0, 00:14:28.533 "state": "configuring", 00:14:28.533 "raid_level": "raid1", 00:14:28.533 "superblock": false, 00:14:28.533 "num_base_bdevs": 3, 00:14:28.533 "num_base_bdevs_discovered": 1, 00:14:28.533 "num_base_bdevs_operational": 3, 00:14:28.533 "base_bdevs_list": [ 00:14:28.533 { 00:14:28.533 "name": "BaseBdev1", 00:14:28.533 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:28.533 "is_configured": true, 00:14:28.533 "data_offset": 0, 00:14:28.533 "data_size": 65536 00:14:28.533 }, 00:14:28.533 { 00:14:28.533 "name": "BaseBdev2", 00:14:28.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.533 "is_configured": false, 00:14:28.533 "data_offset": 0, 00:14:28.533 "data_size": 0 00:14:28.533 }, 00:14:28.533 { 00:14:28.533 "name": "BaseBdev3", 00:14:28.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.533 "is_configured": false, 00:14:28.533 "data_offset": 0, 00:14:28.533 "data_size": 0 00:14:28.533 } 00:14:28.533 ] 00:14:28.533 }' 00:14:28.533 23:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:28.533 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.132 23:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:29.389 [2024-05-14 23:56:29.869464] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:29.389 BaseBdev2 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:29.389 23:56:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:29.389 23:56:29 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:29.647 23:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:29.905 [ 00:14:29.905 { 00:14:29.905 "name": "BaseBdev2", 00:14:29.905 "aliases": [ 00:14:29.905 "3e670965-6529-44d2-9347-186ee50d9bf1" 00:14:29.905 ], 00:14:29.905 "product_name": "Malloc disk", 00:14:29.905 "block_size": 512, 00:14:29.905 "num_blocks": 65536, 00:14:29.905 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:29.905 "assigned_rate_limits": { 00:14:29.905 "rw_ios_per_sec": 0, 00:14:29.905 "rw_mbytes_per_sec": 0, 00:14:29.906 "r_mbytes_per_sec": 0, 00:14:29.906 "w_mbytes_per_sec": 0 00:14:29.906 }, 00:14:29.906 "claimed": true, 00:14:29.906 "claim_type": "exclusive_write", 00:14:29.906 "zoned": false, 00:14:29.906 "supported_io_types": { 00:14:29.906 "read": true, 00:14:29.906 "write": true, 00:14:29.906 "unmap": true, 00:14:29.906 "write_zeroes": true, 00:14:29.906 "flush": true, 00:14:29.906 "reset": true, 00:14:29.906 "compare": false, 00:14:29.906 "compare_and_write": false, 00:14:29.906 "abort": true, 00:14:29.906 "nvme_admin": false, 00:14:29.906 "nvme_io": false 00:14:29.906 }, 00:14:29.906 "memory_domains": [ 00:14:29.906 { 00:14:29.906 "dma_device_id": "system", 00:14:29.906 "dma_device_type": 1 00:14:29.906 }, 00:14:29.906 { 00:14:29.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.906 "dma_device_type": 2 00:14:29.906 } 00:14:29.906 ], 00:14:29.906 "driver_specific": {} 00:14:29.906 } 00:14:29.906 ] 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.906 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:30.164 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:30.164 "name": "Existed_Raid", 00:14:30.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.164 "strip_size_kb": 0, 00:14:30.164 "state": "configuring", 00:14:30.164 "raid_level": "raid1", 00:14:30.164 "superblock": false, 00:14:30.164 "num_base_bdevs": 3, 00:14:30.164 "num_base_bdevs_discovered": 2, 00:14:30.164 "num_base_bdevs_operational": 3, 00:14:30.164 "base_bdevs_list": [ 00:14:30.164 { 00:14:30.164 "name": "BaseBdev1", 00:14:30.164 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:30.164 "is_configured": true, 00:14:30.164 "data_offset": 0, 00:14:30.164 "data_size": 65536 00:14:30.164 }, 00:14:30.164 { 00:14:30.164 "name": "BaseBdev2", 00:14:30.164 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:30.164 "is_configured": true, 00:14:30.164 "data_offset": 0, 00:14:30.164 "data_size": 65536 00:14:30.164 }, 00:14:30.164 { 00:14:30.164 "name": "BaseBdev3", 00:14:30.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.164 "is_configured": false, 00:14:30.164 "data_offset": 0, 00:14:30.164 "data_size": 0 00:14:30.164 } 00:14:30.164 ] 00:14:30.164 }' 00:14:30.164 23:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:30.164 23:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.729 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:30.987 [2024-05-14 23:56:31.433070] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:30.987 [2024-05-14 23:56:31.433109] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x203b560 00:14:30.987 [2024-05-14 23:56:31.433123] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:30.987 [2024-05-14 23:56:31.433322] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2052490 00:14:30.987 [2024-05-14 23:56:31.433471] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x203b560 00:14:30.987 [2024-05-14 23:56:31.433482] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x203b560 00:14:30.987 [2024-05-14 23:56:31.433650] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.987 BaseBdev3 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:30.987 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.246 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:31.503 [ 00:14:31.503 { 00:14:31.503 "name": "BaseBdev3", 00:14:31.503 "aliases": [ 00:14:31.503 "09a15ee9-61eb-4421-8183-5bd27cded52b" 00:14:31.503 ], 00:14:31.503 "product_name": "Malloc disk", 00:14:31.503 "block_size": 512, 00:14:31.503 "num_blocks": 65536, 00:14:31.503 "uuid": "09a15ee9-61eb-4421-8183-5bd27cded52b", 00:14:31.503 "assigned_rate_limits": { 00:14:31.503 "rw_ios_per_sec": 0, 00:14:31.503 "rw_mbytes_per_sec": 0, 00:14:31.503 "r_mbytes_per_sec": 0, 00:14:31.503 "w_mbytes_per_sec": 0 00:14:31.503 }, 00:14:31.503 "claimed": true, 00:14:31.503 "claim_type": "exclusive_write", 00:14:31.503 "zoned": false, 00:14:31.503 "supported_io_types": { 00:14:31.503 "read": true, 00:14:31.503 "write": true, 00:14:31.503 "unmap": true, 00:14:31.503 "write_zeroes": true, 00:14:31.503 "flush": true, 00:14:31.503 "reset": true, 00:14:31.503 "compare": false, 00:14:31.503 "compare_and_write": false, 00:14:31.503 "abort": true, 00:14:31.503 "nvme_admin": false, 00:14:31.503 "nvme_io": false 00:14:31.503 }, 00:14:31.503 "memory_domains": [ 00:14:31.503 { 00:14:31.503 "dma_device_id": "system", 00:14:31.503 "dma_device_type": 1 00:14:31.503 }, 00:14:31.503 { 00:14:31.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.503 "dma_device_type": 2 00:14:31.503 } 00:14:31.503 ], 00:14:31.503 "driver_specific": {} 00:14:31.503 } 00:14:31.503 ] 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.503 23:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.761 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:31.761 "name": "Existed_Raid", 00:14:31.761 "uuid": "11b17c7c-99d6-4229-a064-106dd49d32ed", 00:14:31.761 "strip_size_kb": 0, 00:14:31.761 "state": "online", 
00:14:31.761 "raid_level": "raid1", 00:14:31.761 "superblock": false, 00:14:31.761 "num_base_bdevs": 3, 00:14:31.761 "num_base_bdevs_discovered": 3, 00:14:31.761 "num_base_bdevs_operational": 3, 00:14:31.761 "base_bdevs_list": [ 00:14:31.761 { 00:14:31.761 "name": "BaseBdev1", 00:14:31.761 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:31.761 "is_configured": true, 00:14:31.761 "data_offset": 0, 00:14:31.761 "data_size": 65536 00:14:31.761 }, 00:14:31.761 { 00:14:31.761 "name": "BaseBdev2", 00:14:31.761 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:31.761 "is_configured": true, 00:14:31.761 "data_offset": 0, 00:14:31.761 "data_size": 65536 00:14:31.761 }, 00:14:31.761 { 00:14:31.761 "name": "BaseBdev3", 00:14:31.761 "uuid": "09a15ee9-61eb-4421-8183-5bd27cded52b", 00:14:31.761 "is_configured": true, 00:14:31.761 "data_offset": 0, 00:14:31.761 "data_size": 65536 00:14:31.761 } 00:14:31.761 ] 00:14:31.761 }' 00:14:31.761 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:31.761 23:56:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:32.326 23:56:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:32.583 [2024-05-14 23:56:33.009524] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:32.583 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:32.583 "name": "Existed_Raid", 00:14:32.583 "aliases": [ 00:14:32.583 "11b17c7c-99d6-4229-a064-106dd49d32ed" 00:14:32.583 ], 00:14:32.583 "product_name": "Raid Volume", 00:14:32.583 "block_size": 512, 00:14:32.583 "num_blocks": 65536, 00:14:32.583 "uuid": "11b17c7c-99d6-4229-a064-106dd49d32ed", 00:14:32.583 "assigned_rate_limits": { 00:14:32.583 "rw_ios_per_sec": 0, 00:14:32.583 "rw_mbytes_per_sec": 0, 00:14:32.583 "r_mbytes_per_sec": 0, 00:14:32.583 "w_mbytes_per_sec": 0 00:14:32.583 }, 00:14:32.583 "claimed": false, 00:14:32.583 "zoned": false, 00:14:32.583 "supported_io_types": { 00:14:32.583 "read": true, 00:14:32.583 "write": true, 00:14:32.583 "unmap": false, 00:14:32.583 "write_zeroes": true, 00:14:32.583 "flush": false, 00:14:32.583 "reset": true, 00:14:32.583 "compare": false, 00:14:32.583 "compare_and_write": false, 00:14:32.583 "abort": false, 00:14:32.583 "nvme_admin": false, 00:14:32.583 "nvme_io": false 00:14:32.583 }, 00:14:32.583 "memory_domains": [ 00:14:32.583 { 00:14:32.583 "dma_device_id": "system", 00:14:32.583 "dma_device_type": 1 00:14:32.583 }, 00:14:32.583 { 00:14:32.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.583 "dma_device_type": 2 00:14:32.583 }, 
00:14:32.583 { 00:14:32.583 "dma_device_id": "system", 00:14:32.583 "dma_device_type": 1 00:14:32.583 }, 00:14:32.583 { 00:14:32.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.583 "dma_device_type": 2 00:14:32.583 }, 00:14:32.583 { 00:14:32.583 "dma_device_id": "system", 00:14:32.583 "dma_device_type": 1 00:14:32.583 }, 00:14:32.583 { 00:14:32.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.583 "dma_device_type": 2 00:14:32.583 } 00:14:32.583 ], 00:14:32.583 "driver_specific": { 00:14:32.583 "raid": { 00:14:32.583 "uuid": "11b17c7c-99d6-4229-a064-106dd49d32ed", 00:14:32.583 "strip_size_kb": 0, 00:14:32.583 "state": "online", 00:14:32.583 "raid_level": "raid1", 00:14:32.583 "superblock": false, 00:14:32.584 "num_base_bdevs": 3, 00:14:32.584 "num_base_bdevs_discovered": 3, 00:14:32.584 "num_base_bdevs_operational": 3, 00:14:32.584 "base_bdevs_list": [ 00:14:32.584 { 00:14:32.584 "name": "BaseBdev1", 00:14:32.584 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:32.584 "is_configured": true, 00:14:32.584 "data_offset": 0, 00:14:32.584 "data_size": 65536 00:14:32.584 }, 00:14:32.584 { 00:14:32.584 "name": "BaseBdev2", 00:14:32.584 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:32.584 "is_configured": true, 00:14:32.584 "data_offset": 0, 00:14:32.584 "data_size": 65536 00:14:32.584 }, 00:14:32.584 { 00:14:32.584 "name": "BaseBdev3", 00:14:32.584 "uuid": "09a15ee9-61eb-4421-8183-5bd27cded52b", 00:14:32.584 "is_configured": true, 00:14:32.584 "data_offset": 0, 00:14:32.584 "data_size": 65536 00:14:32.584 } 00:14:32.584 ] 00:14:32.584 } 00:14:32.584 } 00:14:32.584 }' 00:14:32.584 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:32.584 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:14:32.584 BaseBdev2 00:14:32.584 BaseBdev3' 00:14:32.584 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:32.584 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:32.584 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:32.842 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:32.842 "name": "BaseBdev1", 00:14:32.842 "aliases": [ 00:14:32.842 "72bca0eb-0df6-400e-8a5b-15e355be09de" 00:14:32.842 ], 00:14:32.842 "product_name": "Malloc disk", 00:14:32.842 "block_size": 512, 00:14:32.842 "num_blocks": 65536, 00:14:32.842 "uuid": "72bca0eb-0df6-400e-8a5b-15e355be09de", 00:14:32.842 "assigned_rate_limits": { 00:14:32.842 "rw_ios_per_sec": 0, 00:14:32.842 "rw_mbytes_per_sec": 0, 00:14:32.842 "r_mbytes_per_sec": 0, 00:14:32.842 "w_mbytes_per_sec": 0 00:14:32.842 }, 00:14:32.842 "claimed": true, 00:14:32.842 "claim_type": "exclusive_write", 00:14:32.842 "zoned": false, 00:14:32.842 "supported_io_types": { 00:14:32.842 "read": true, 00:14:32.842 "write": true, 00:14:32.842 "unmap": true, 00:14:32.842 "write_zeroes": true, 00:14:32.842 "flush": true, 00:14:32.842 "reset": true, 00:14:32.842 "compare": false, 00:14:32.842 "compare_and_write": false, 00:14:32.842 "abort": true, 00:14:32.842 "nvme_admin": false, 00:14:32.842 "nvme_io": false 00:14:32.842 }, 00:14:32.842 "memory_domains": [ 00:14:32.842 { 00:14:32.842 "dma_device_id": "system", 
00:14:32.842 "dma_device_type": 1 00:14:32.842 }, 00:14:32.842 { 00:14:32.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.842 "dma_device_type": 2 00:14:32.842 } 00:14:32.842 ], 00:14:32.842 "driver_specific": {} 00:14:32.842 }' 00:14:32.842 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:32.843 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:32.843 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:32.843 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:33.100 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:33.357 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:33.357 "name": "BaseBdev2", 00:14:33.357 "aliases": [ 00:14:33.357 "3e670965-6529-44d2-9347-186ee50d9bf1" 00:14:33.357 ], 00:14:33.357 "product_name": "Malloc disk", 00:14:33.357 "block_size": 512, 00:14:33.357 "num_blocks": 65536, 00:14:33.357 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:33.357 "assigned_rate_limits": { 00:14:33.357 "rw_ios_per_sec": 0, 00:14:33.357 "rw_mbytes_per_sec": 0, 00:14:33.357 "r_mbytes_per_sec": 0, 00:14:33.357 "w_mbytes_per_sec": 0 00:14:33.357 }, 00:14:33.357 "claimed": true, 00:14:33.357 "claim_type": "exclusive_write", 00:14:33.357 "zoned": false, 00:14:33.357 "supported_io_types": { 00:14:33.357 "read": true, 00:14:33.357 "write": true, 00:14:33.357 "unmap": true, 00:14:33.357 "write_zeroes": true, 00:14:33.357 "flush": true, 00:14:33.357 "reset": true, 00:14:33.357 "compare": false, 00:14:33.357 "compare_and_write": false, 00:14:33.357 "abort": true, 00:14:33.357 "nvme_admin": false, 00:14:33.357 "nvme_io": false 00:14:33.357 }, 00:14:33.357 "memory_domains": [ 00:14:33.357 { 00:14:33.357 "dma_device_id": "system", 00:14:33.357 "dma_device_type": 1 00:14:33.357 }, 00:14:33.357 { 00:14:33.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.357 "dma_device_type": 2 00:14:33.357 } 00:14:33.357 ], 00:14:33.357 "driver_specific": {} 00:14:33.357 }' 00:14:33.357 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:33.615 23:56:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:33.615 23:56:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.615 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.873 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.873 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:33.873 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:33.873 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:33.873 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:34.131 "name": "BaseBdev3", 00:14:34.131 "aliases": [ 00:14:34.131 "09a15ee9-61eb-4421-8183-5bd27cded52b" 00:14:34.131 ], 00:14:34.131 "product_name": "Malloc disk", 00:14:34.131 "block_size": 512, 00:14:34.131 "num_blocks": 65536, 00:14:34.131 "uuid": "09a15ee9-61eb-4421-8183-5bd27cded52b", 00:14:34.131 "assigned_rate_limits": { 00:14:34.131 "rw_ios_per_sec": 0, 00:14:34.131 "rw_mbytes_per_sec": 0, 00:14:34.131 "r_mbytes_per_sec": 0, 00:14:34.131 "w_mbytes_per_sec": 0 00:14:34.131 }, 00:14:34.131 "claimed": true, 00:14:34.131 "claim_type": "exclusive_write", 00:14:34.131 "zoned": false, 00:14:34.131 "supported_io_types": { 00:14:34.131 "read": true, 00:14:34.131 "write": true, 00:14:34.131 "unmap": true, 00:14:34.131 "write_zeroes": true, 00:14:34.131 "flush": true, 00:14:34.131 "reset": true, 00:14:34.131 "compare": false, 00:14:34.131 "compare_and_write": false, 00:14:34.131 "abort": true, 00:14:34.131 "nvme_admin": false, 00:14:34.131 "nvme_io": false 00:14:34.131 }, 00:14:34.131 "memory_domains": [ 00:14:34.131 { 00:14:34.131 "dma_device_id": "system", 00:14:34.131 "dma_device_type": 1 00:14:34.131 }, 00:14:34.131 { 00:14:34.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.131 "dma_device_type": 2 00:14:34.131 } 00:14:34.131 ], 00:14:34.131 "driver_specific": {} 00:14:34.131 }' 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:34.131 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:34.389 23:56:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:34.647 [2024-05-14 23:56:35.106851] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.647 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.905 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:34.905 "name": "Existed_Raid", 00:14:34.905 "uuid": "11b17c7c-99d6-4229-a064-106dd49d32ed", 00:14:34.905 "strip_size_kb": 0, 00:14:34.905 "state": "online", 00:14:34.905 "raid_level": "raid1", 00:14:34.905 "superblock": false, 00:14:34.905 "num_base_bdevs": 3, 00:14:34.905 "num_base_bdevs_discovered": 2, 00:14:34.905 "num_base_bdevs_operational": 2, 00:14:34.905 "base_bdevs_list": [ 00:14:34.905 { 00:14:34.905 "name": null, 00:14:34.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.905 "is_configured": false, 00:14:34.905 "data_offset": 0, 00:14:34.905 "data_size": 65536 
00:14:34.905 }, 00:14:34.905 { 00:14:34.905 "name": "BaseBdev2", 00:14:34.905 "uuid": "3e670965-6529-44d2-9347-186ee50d9bf1", 00:14:34.905 "is_configured": true, 00:14:34.905 "data_offset": 0, 00:14:34.905 "data_size": 65536 00:14:34.905 }, 00:14:34.905 { 00:14:34.905 "name": "BaseBdev3", 00:14:34.905 "uuid": "09a15ee9-61eb-4421-8183-5bd27cded52b", 00:14:34.905 "is_configured": true, 00:14:34.905 "data_offset": 0, 00:14:34.905 "data_size": 65536 00:14:34.905 } 00:14:34.905 ] 00:14:34.905 }' 00:14:34.905 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:34.905 23:56:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.471 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:14:35.471 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:35.471 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.471 23:56:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:35.729 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:35.729 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:35.729 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:35.987 [2024-05-14 23:56:36.447496] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:35.987 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:35.987 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:35.987 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.987 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:36.245 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:36.245 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:36.245 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:36.502 [2024-05-14 23:56:36.949274] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:36.502 [2024-05-14 23:56:36.949346] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:36.503 [2024-05-14 23:56:36.961807] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:36.503 [2024-05-14 23:56:36.961868] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:36.503 [2024-05-14 23:56:36.961881] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203b560 name Existed_Raid, state offline 00:14:36.503 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:36.503 23:56:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:36.503 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.503 23:56:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:36.761 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:37.019 BaseBdev2 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:37.019 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:37.277 23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:37.535 [ 00:14:37.535 { 00:14:37.535 "name": "BaseBdev2", 00:14:37.535 "aliases": [ 00:14:37.535 "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c" 00:14:37.535 ], 00:14:37.535 "product_name": "Malloc disk", 00:14:37.535 "block_size": 512, 00:14:37.535 "num_blocks": 65536, 00:14:37.535 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:37.535 "assigned_rate_limits": { 00:14:37.535 "rw_ios_per_sec": 0, 00:14:37.535 "rw_mbytes_per_sec": 0, 00:14:37.535 "r_mbytes_per_sec": 0, 00:14:37.535 "w_mbytes_per_sec": 0 00:14:37.535 }, 00:14:37.535 "claimed": false, 00:14:37.535 "zoned": false, 00:14:37.535 "supported_io_types": { 00:14:37.535 "read": true, 00:14:37.535 "write": true, 00:14:37.535 "unmap": true, 00:14:37.535 "write_zeroes": true, 00:14:37.535 "flush": true, 00:14:37.535 "reset": true, 00:14:37.535 "compare": false, 00:14:37.535 "compare_and_write": false, 00:14:37.535 "abort": true, 00:14:37.535 "nvme_admin": false, 00:14:37.535 "nvme_io": false 00:14:37.535 }, 00:14:37.535 "memory_domains": [ 00:14:37.535 { 00:14:37.535 "dma_device_id": "system", 00:14:37.535 "dma_device_type": 1 00:14:37.535 }, 00:14:37.535 { 00:14:37.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.535 "dma_device_type": 2 00:14:37.535 } 00:14:37.535 ], 00:14:37.535 "driver_specific": {} 00:14:37.535 } 00:14:37.535 ] 00:14:37.535 
23:56:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:37.535 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:37.535 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:37.536 23:56:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:37.794 BaseBdev3 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:37.794 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.053 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:38.053 [ 00:14:38.053 { 00:14:38.053 "name": "BaseBdev3", 00:14:38.053 "aliases": [ 00:14:38.053 "3bd410a0-d103-480e-94d5-3243d801648d" 00:14:38.053 ], 00:14:38.053 "product_name": "Malloc disk", 00:14:38.053 "block_size": 512, 00:14:38.053 "num_blocks": 65536, 00:14:38.053 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:38.053 "assigned_rate_limits": { 00:14:38.053 "rw_ios_per_sec": 0, 00:14:38.053 "rw_mbytes_per_sec": 0, 00:14:38.053 "r_mbytes_per_sec": 0, 00:14:38.053 "w_mbytes_per_sec": 0 00:14:38.053 }, 00:14:38.053 "claimed": false, 00:14:38.053 "zoned": false, 00:14:38.053 "supported_io_types": { 00:14:38.053 "read": true, 00:14:38.053 "write": true, 00:14:38.053 "unmap": true, 00:14:38.053 "write_zeroes": true, 00:14:38.053 "flush": true, 00:14:38.053 "reset": true, 00:14:38.053 "compare": false, 00:14:38.053 "compare_and_write": false, 00:14:38.053 "abort": true, 00:14:38.053 "nvme_admin": false, 00:14:38.053 "nvme_io": false 00:14:38.053 }, 00:14:38.053 "memory_domains": [ 00:14:38.053 { 00:14:38.053 "dma_device_id": "system", 00:14:38.053 "dma_device_type": 1 00:14:38.053 }, 00:14:38.053 { 00:14:38.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.053 "dma_device_type": 2 00:14:38.053 } 00:14:38.053 ], 00:14:38.053 "driver_specific": {} 00:14:38.053 } 00:14:38.053 ] 00:14:38.311 23:56:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:38.311 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:38.311 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:38.312 [2024-05-14 
23:56:38.879359] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:38.312 [2024-05-14 23:56:38.879411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:38.312 [2024-05-14 23:56:38.879430] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:38.312 [2024-05-14 23:56:38.880823] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:38.312 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:38.570 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.570 23:56:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.570 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:38.570 "name": "Existed_Raid", 00:14:38.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.570 "strip_size_kb": 0, 00:14:38.570 "state": "configuring", 00:14:38.570 "raid_level": "raid1", 00:14:38.570 "superblock": false, 00:14:38.570 "num_base_bdevs": 3, 00:14:38.570 "num_base_bdevs_discovered": 2, 00:14:38.570 "num_base_bdevs_operational": 3, 00:14:38.570 "base_bdevs_list": [ 00:14:38.570 { 00:14:38.570 "name": "BaseBdev1", 00:14:38.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.570 "is_configured": false, 00:14:38.570 "data_offset": 0, 00:14:38.570 "data_size": 0 00:14:38.570 }, 00:14:38.570 { 00:14:38.570 "name": "BaseBdev2", 00:14:38.570 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:38.570 "is_configured": true, 00:14:38.570 "data_offset": 0, 00:14:38.570 "data_size": 65536 00:14:38.570 }, 00:14:38.570 { 00:14:38.570 "name": "BaseBdev3", 00:14:38.570 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:38.570 "is_configured": true, 00:14:38.570 "data_offset": 0, 00:14:38.570 "data_size": 65536 00:14:38.570 } 00:14:38.570 ] 00:14:38.570 }' 00:14:38.570 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:38.570 23:56:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:39.503 [2024-05-14 23:56:39.958218] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.503 23:56:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.761 23:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:39.761 "name": "Existed_Raid", 00:14:39.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.761 "strip_size_kb": 0, 00:14:39.761 "state": "configuring", 00:14:39.761 "raid_level": "raid1", 00:14:39.761 "superblock": false, 00:14:39.761 "num_base_bdevs": 3, 00:14:39.761 "num_base_bdevs_discovered": 1, 00:14:39.761 "num_base_bdevs_operational": 3, 00:14:39.761 "base_bdevs_list": [ 00:14:39.761 { 00:14:39.761 "name": "BaseBdev1", 00:14:39.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.761 "is_configured": false, 00:14:39.761 "data_offset": 0, 00:14:39.761 "data_size": 0 00:14:39.761 }, 00:14:39.761 { 00:14:39.761 "name": null, 00:14:39.761 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:39.761 "is_configured": false, 00:14:39.761 "data_offset": 0, 00:14:39.761 "data_size": 65536 00:14:39.761 }, 00:14:39.761 { 00:14:39.761 "name": "BaseBdev3", 00:14:39.761 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:39.761 "is_configured": true, 00:14:39.761 "data_offset": 0, 00:14:39.761 "data_size": 65536 00:14:39.761 } 00:14:39.761 ] 00:14:39.761 }' 00:14:39.761 23:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:39.761 23:56:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.340 23:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:40.340 23:56:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.600 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:14:40.600 23:56:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:40.857 [2024-05-14 23:56:41.305261] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:40.857 BaseBdev1 00:14:40.857 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:14:40.857 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:40.858 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:40.858 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:40.858 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:40.858 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:40.858 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:41.115 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:41.373 [ 00:14:41.373 { 00:14:41.373 "name": "BaseBdev1", 00:14:41.373 "aliases": [ 00:14:41.373 "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd" 00:14:41.373 ], 00:14:41.373 "product_name": "Malloc disk", 00:14:41.373 "block_size": 512, 00:14:41.373 "num_blocks": 65536, 00:14:41.373 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:41.373 "assigned_rate_limits": { 00:14:41.373 "rw_ios_per_sec": 0, 00:14:41.373 "rw_mbytes_per_sec": 0, 00:14:41.373 "r_mbytes_per_sec": 0, 00:14:41.373 "w_mbytes_per_sec": 0 00:14:41.373 }, 00:14:41.373 "claimed": true, 00:14:41.373 "claim_type": "exclusive_write", 00:14:41.373 "zoned": false, 00:14:41.373 "supported_io_types": { 00:14:41.373 "read": true, 00:14:41.373 "write": true, 00:14:41.373 "unmap": true, 00:14:41.373 "write_zeroes": true, 00:14:41.373 "flush": true, 00:14:41.373 "reset": true, 00:14:41.373 "compare": false, 00:14:41.373 "compare_and_write": false, 00:14:41.373 "abort": true, 00:14:41.373 "nvme_admin": false, 00:14:41.373 "nvme_io": false 00:14:41.373 }, 00:14:41.373 "memory_domains": [ 00:14:41.373 { 00:14:41.373 "dma_device_id": "system", 00:14:41.373 "dma_device_type": 1 00:14:41.373 }, 00:14:41.373 { 00:14:41.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.373 "dma_device_type": 2 00:14:41.373 } 00:14:41.373 ], 00:14:41.373 "driver_specific": {} 00:14:41.373 } 00:14:41.373 ] 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:41.373 23:56:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.373 23:56:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.630 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:41.630 "name": "Existed_Raid", 00:14:41.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.630 "strip_size_kb": 0, 00:14:41.630 "state": "configuring", 00:14:41.630 "raid_level": "raid1", 00:14:41.630 "superblock": false, 00:14:41.630 "num_base_bdevs": 3, 00:14:41.630 "num_base_bdevs_discovered": 2, 00:14:41.630 "num_base_bdevs_operational": 3, 00:14:41.630 "base_bdevs_list": [ 00:14:41.630 { 00:14:41.630 "name": "BaseBdev1", 00:14:41.630 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:41.630 "is_configured": true, 00:14:41.630 "data_offset": 0, 00:14:41.630 "data_size": 65536 00:14:41.630 }, 00:14:41.630 { 00:14:41.630 "name": null, 00:14:41.630 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:41.630 "is_configured": false, 00:14:41.630 "data_offset": 0, 00:14:41.630 "data_size": 65536 00:14:41.630 }, 00:14:41.630 { 00:14:41.630 "name": "BaseBdev3", 00:14:41.630 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:41.630 "is_configured": true, 00:14:41.630 "data_offset": 0, 00:14:41.630 "data_size": 65536 00:14:41.630 } 00:14:41.630 ] 00:14:41.630 }' 00:14:41.630 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:41.630 23:56:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.196 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.196 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:42.454 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:14:42.454 23:56:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:42.712 [2024-05-14 23:56:43.114094] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:42.712 23:56:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.712 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.981 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:42.981 "name": "Existed_Raid", 00:14:42.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.981 "strip_size_kb": 0, 00:14:42.981 "state": "configuring", 00:14:42.981 "raid_level": "raid1", 00:14:42.981 "superblock": false, 00:14:42.981 "num_base_bdevs": 3, 00:14:42.981 "num_base_bdevs_discovered": 1, 00:14:42.981 "num_base_bdevs_operational": 3, 00:14:42.981 "base_bdevs_list": [ 00:14:42.981 { 00:14:42.981 "name": "BaseBdev1", 00:14:42.981 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:42.981 "is_configured": true, 00:14:42.981 "data_offset": 0, 00:14:42.981 "data_size": 65536 00:14:42.981 }, 00:14:42.981 { 00:14:42.981 "name": null, 00:14:42.981 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:42.981 "is_configured": false, 00:14:42.981 "data_offset": 0, 00:14:42.981 "data_size": 65536 00:14:42.981 }, 00:14:42.981 { 00:14:42.981 "name": null, 00:14:42.981 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:42.981 "is_configured": false, 00:14:42.981 "data_offset": 0, 00:14:42.981 "data_size": 65536 00:14:42.981 } 00:14:42.981 ] 00:14:42.981 }' 00:14:42.981 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:42.981 23:56:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.547 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.547 23:56:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:43.805 [2024-05-14 23:56:44.357413] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 
00:14:43.805 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.806 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.063 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:44.063 "name": "Existed_Raid", 00:14:44.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.063 "strip_size_kb": 0, 00:14:44.063 "state": "configuring", 00:14:44.063 "raid_level": "raid1", 00:14:44.063 "superblock": false, 00:14:44.063 "num_base_bdevs": 3, 00:14:44.063 "num_base_bdevs_discovered": 2, 00:14:44.063 "num_base_bdevs_operational": 3, 00:14:44.063 "base_bdevs_list": [ 00:14:44.063 { 00:14:44.063 "name": "BaseBdev1", 00:14:44.063 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:44.063 "is_configured": true, 00:14:44.063 "data_offset": 0, 00:14:44.063 "data_size": 65536 00:14:44.063 }, 00:14:44.063 { 00:14:44.063 "name": null, 00:14:44.063 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:44.063 "is_configured": false, 00:14:44.063 "data_offset": 0, 00:14:44.063 "data_size": 65536 00:14:44.063 }, 00:14:44.063 { 00:14:44.063 "name": "BaseBdev3", 00:14:44.063 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:44.063 "is_configured": true, 00:14:44.063 "data_offset": 0, 00:14:44.063 "data_size": 65536 00:14:44.063 } 00:14:44.063 ] 00:14:44.063 }' 00:14:44.063 23:56:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:44.064 23:56:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.011 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.011 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:45.011 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:45.011 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:45.297 [2024-05-14 23:56:45.632832] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:45.297 23:56:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.297 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.571 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:45.571 "name": "Existed_Raid", 00:14:45.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.572 "strip_size_kb": 0, 00:14:45.572 "state": "configuring", 00:14:45.572 "raid_level": "raid1", 00:14:45.572 "superblock": false, 00:14:45.572 "num_base_bdevs": 3, 00:14:45.572 "num_base_bdevs_discovered": 1, 00:14:45.572 "num_base_bdevs_operational": 3, 00:14:45.572 "base_bdevs_list": [ 00:14:45.572 { 00:14:45.572 "name": null, 00:14:45.572 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:45.572 "is_configured": false, 00:14:45.572 "data_offset": 0, 00:14:45.572 "data_size": 65536 00:14:45.572 }, 00:14:45.572 { 00:14:45.572 "name": null, 00:14:45.572 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:45.572 "is_configured": false, 00:14:45.572 "data_offset": 0, 00:14:45.572 "data_size": 65536 00:14:45.572 }, 00:14:45.572 { 00:14:45.572 "name": "BaseBdev3", 00:14:45.572 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:45.572 "is_configured": true, 00:14:45.572 "data_offset": 0, 00:14:45.572 "data_size": 65536 00:14:45.572 } 00:14:45.572 ] 00:14:45.572 }' 00:14:45.572 23:56:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:45.572 23:56:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.137 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.137 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:46.137 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:46.137 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:46.394 [2024-05-14 23:56:46.834839] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:46.395 
23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.395 23:56:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.652 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:46.652 "name": "Existed_Raid", 00:14:46.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.652 "strip_size_kb": 0, 00:14:46.652 "state": "configuring", 00:14:46.652 "raid_level": "raid1", 00:14:46.652 "superblock": false, 00:14:46.652 "num_base_bdevs": 3, 00:14:46.652 "num_base_bdevs_discovered": 2, 00:14:46.652 "num_base_bdevs_operational": 3, 00:14:46.652 "base_bdevs_list": [ 00:14:46.652 { 00:14:46.652 "name": null, 00:14:46.652 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:46.652 "is_configured": false, 00:14:46.652 "data_offset": 0, 00:14:46.652 "data_size": 65536 00:14:46.652 }, 00:14:46.652 { 00:14:46.652 "name": "BaseBdev2", 00:14:46.652 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:46.652 "is_configured": true, 00:14:46.652 "data_offset": 0, 00:14:46.652 "data_size": 65536 00:14:46.652 }, 00:14:46.652 { 00:14:46.652 "name": "BaseBdev3", 00:14:46.652 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:46.652 "is_configured": true, 00:14:46.652 "data_offset": 0, 00:14:46.652 "data_size": 65536 00:14:46.652 } 00:14:46.652 ] 00:14:46.652 }' 00:14:46.652 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:46.652 23:56:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.217 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.217 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:47.476 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:47.476 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.476 23:56:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:47.476 23:56:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b8d96af6-6972-4d0c-86fd-ec96d19dd2bd 00:14:47.735 [2024-05-14 23:56:48.291217] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:47.735 [2024-05-14 23:56:48.291260] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21e01d0 00:14:47.735 [2024-05-14 23:56:48.291269] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:47.735 [2024-05-14 23:56:48.291484] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21edde0 00:14:47.735 [2024-05-14 23:56:48.291633] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21e01d0 00:14:47.735 [2024-05-14 23:56:48.291643] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21e01d0 00:14:47.735 [2024-05-14 23:56:48.291815] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:47.735 NewBaseBdev 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:47.735 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:47.993 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:48.252 [ 00:14:48.252 { 00:14:48.252 "name": "NewBaseBdev", 00:14:48.252 "aliases": [ 00:14:48.252 "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd" 00:14:48.252 ], 00:14:48.252 "product_name": "Malloc disk", 00:14:48.252 "block_size": 512, 00:14:48.252 "num_blocks": 65536, 00:14:48.252 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:48.252 "assigned_rate_limits": { 00:14:48.252 "rw_ios_per_sec": 0, 00:14:48.252 "rw_mbytes_per_sec": 0, 00:14:48.252 "r_mbytes_per_sec": 0, 00:14:48.252 "w_mbytes_per_sec": 0 00:14:48.252 }, 00:14:48.252 "claimed": true, 00:14:48.252 "claim_type": "exclusive_write", 00:14:48.252 "zoned": false, 00:14:48.252 "supported_io_types": { 00:14:48.252 "read": true, 00:14:48.252 "write": true, 00:14:48.252 "unmap": true, 00:14:48.252 "write_zeroes": true, 00:14:48.252 "flush": true, 00:14:48.252 "reset": true, 00:14:48.252 "compare": false, 00:14:48.252 "compare_and_write": false, 00:14:48.252 "abort": true, 00:14:48.252 "nvme_admin": false, 00:14:48.252 "nvme_io": false 00:14:48.252 }, 00:14:48.252 "memory_domains": [ 00:14:48.252 { 00:14:48.252 "dma_device_id": "system", 00:14:48.252 "dma_device_type": 1 00:14:48.252 }, 00:14:48.252 { 00:14:48.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.252 "dma_device_type": 2 00:14:48.252 } 00:14:48.252 ], 
00:14:48.252 "driver_specific": {} 00:14:48.252 } 00:14:48.252 ] 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.252 23:56:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.510 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:48.510 "name": "Existed_Raid", 00:14:48.510 "uuid": "1cd4000a-8daa-4768-9303-741df3ed8547", 00:14:48.510 "strip_size_kb": 0, 00:14:48.510 "state": "online", 00:14:48.510 "raid_level": "raid1", 00:14:48.510 "superblock": false, 00:14:48.510 "num_base_bdevs": 3, 00:14:48.510 "num_base_bdevs_discovered": 3, 00:14:48.510 "num_base_bdevs_operational": 3, 00:14:48.510 "base_bdevs_list": [ 00:14:48.510 { 00:14:48.510 "name": "NewBaseBdev", 00:14:48.510 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:48.510 "is_configured": true, 00:14:48.510 "data_offset": 0, 00:14:48.510 "data_size": 65536 00:14:48.510 }, 00:14:48.510 { 00:14:48.510 "name": "BaseBdev2", 00:14:48.510 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:48.510 "is_configured": true, 00:14:48.510 "data_offset": 0, 00:14:48.510 "data_size": 65536 00:14:48.510 }, 00:14:48.510 { 00:14:48.510 "name": "BaseBdev3", 00:14:48.510 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:48.510 "is_configured": true, 00:14:48.510 "data_offset": 0, 00:14:48.510 "data_size": 65536 00:14:48.510 } 00:14:48.510 ] 00:14:48.510 }' 00:14:48.510 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:48.510 23:56:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:49.077 23:56:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:49.077 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:49.335 [2024-05-14 23:56:49.799494] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:49.335 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:49.335 "name": "Existed_Raid", 00:14:49.335 "aliases": [ 00:14:49.335 "1cd4000a-8daa-4768-9303-741df3ed8547" 00:14:49.335 ], 00:14:49.335 "product_name": "Raid Volume", 00:14:49.335 "block_size": 512, 00:14:49.335 "num_blocks": 65536, 00:14:49.335 "uuid": "1cd4000a-8daa-4768-9303-741df3ed8547", 00:14:49.335 "assigned_rate_limits": { 00:14:49.335 "rw_ios_per_sec": 0, 00:14:49.335 "rw_mbytes_per_sec": 0, 00:14:49.335 "r_mbytes_per_sec": 0, 00:14:49.335 "w_mbytes_per_sec": 0 00:14:49.335 }, 00:14:49.335 "claimed": false, 00:14:49.335 "zoned": false, 00:14:49.335 "supported_io_types": { 00:14:49.335 "read": true, 00:14:49.335 "write": true, 00:14:49.335 "unmap": false, 00:14:49.335 "write_zeroes": true, 00:14:49.335 "flush": false, 00:14:49.335 "reset": true, 00:14:49.335 "compare": false, 00:14:49.335 "compare_and_write": false, 00:14:49.335 "abort": false, 00:14:49.335 "nvme_admin": false, 00:14:49.335 "nvme_io": false 00:14:49.335 }, 00:14:49.335 "memory_domains": [ 00:14:49.335 { 00:14:49.335 "dma_device_id": "system", 00:14:49.335 "dma_device_type": 1 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.335 "dma_device_type": 2 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "dma_device_id": "system", 00:14:49.335 "dma_device_type": 1 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.335 "dma_device_type": 2 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "dma_device_id": "system", 00:14:49.335 "dma_device_type": 1 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.335 "dma_device_type": 2 00:14:49.335 } 00:14:49.335 ], 00:14:49.335 "driver_specific": { 00:14:49.335 "raid": { 00:14:49.335 "uuid": "1cd4000a-8daa-4768-9303-741df3ed8547", 00:14:49.335 "strip_size_kb": 0, 00:14:49.335 "state": "online", 00:14:49.335 "raid_level": "raid1", 00:14:49.335 "superblock": false, 00:14:49.335 "num_base_bdevs": 3, 00:14:49.335 "num_base_bdevs_discovered": 3, 00:14:49.335 "num_base_bdevs_operational": 3, 00:14:49.335 "base_bdevs_list": [ 00:14:49.335 { 00:14:49.335 "name": "NewBaseBdev", 00:14:49.335 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:49.335 "is_configured": true, 00:14:49.335 "data_offset": 0, 00:14:49.335 "data_size": 65536 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "name": "BaseBdev2", 00:14:49.335 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:49.335 "is_configured": true, 00:14:49.335 "data_offset": 0, 00:14:49.335 "data_size": 65536 00:14:49.335 }, 00:14:49.335 { 00:14:49.335 "name": "BaseBdev3", 00:14:49.335 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:49.335 "is_configured": true, 00:14:49.335 "data_offset": 0, 00:14:49.335 "data_size": 65536 00:14:49.335 } 00:14:49.335 ] 00:14:49.335 } 00:14:49.335 } 00:14:49.335 }' 00:14:49.335 
23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:49.335 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:49.335 BaseBdev2 00:14:49.335 BaseBdev3' 00:14:49.335 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:49.335 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:49.335 23:56:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:49.593 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:49.593 "name": "NewBaseBdev", 00:14:49.593 "aliases": [ 00:14:49.593 "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd" 00:14:49.593 ], 00:14:49.593 "product_name": "Malloc disk", 00:14:49.593 "block_size": 512, 00:14:49.593 "num_blocks": 65536, 00:14:49.593 "uuid": "b8d96af6-6972-4d0c-86fd-ec96d19dd2bd", 00:14:49.593 "assigned_rate_limits": { 00:14:49.593 "rw_ios_per_sec": 0, 00:14:49.593 "rw_mbytes_per_sec": 0, 00:14:49.593 "r_mbytes_per_sec": 0, 00:14:49.593 "w_mbytes_per_sec": 0 00:14:49.593 }, 00:14:49.593 "claimed": true, 00:14:49.593 "claim_type": "exclusive_write", 00:14:49.593 "zoned": false, 00:14:49.593 "supported_io_types": { 00:14:49.593 "read": true, 00:14:49.593 "write": true, 00:14:49.593 "unmap": true, 00:14:49.593 "write_zeroes": true, 00:14:49.593 "flush": true, 00:14:49.593 "reset": true, 00:14:49.593 "compare": false, 00:14:49.593 "compare_and_write": false, 00:14:49.593 "abort": true, 00:14:49.593 "nvme_admin": false, 00:14:49.593 "nvme_io": false 00:14:49.593 }, 00:14:49.593 "memory_domains": [ 00:14:49.593 { 00:14:49.593 "dma_device_id": "system", 00:14:49.593 "dma_device_type": 1 00:14:49.593 }, 00:14:49.593 { 00:14:49.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.593 "dma_device_type": 2 00:14:49.593 } 00:14:49.593 ], 00:14:49.593 "driver_specific": {} 00:14:49.593 }' 00:14:49.593 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:49.593 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:49.851 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:50.109 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:50.109 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:50.109 23:56:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:50.109 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:50.109 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:50.109 "name": "BaseBdev2", 00:14:50.109 "aliases": [ 00:14:50.109 "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c" 00:14:50.109 ], 00:14:50.109 "product_name": "Malloc disk", 00:14:50.109 "block_size": 512, 00:14:50.109 "num_blocks": 65536, 00:14:50.109 "uuid": "88d18c01-56b7-4a9b-b41b-c871fa9e4d5c", 00:14:50.109 "assigned_rate_limits": { 00:14:50.109 "rw_ios_per_sec": 0, 00:14:50.109 "rw_mbytes_per_sec": 0, 00:14:50.109 "r_mbytes_per_sec": 0, 00:14:50.109 "w_mbytes_per_sec": 0 00:14:50.109 }, 00:14:50.109 "claimed": true, 00:14:50.109 "claim_type": "exclusive_write", 00:14:50.109 "zoned": false, 00:14:50.109 "supported_io_types": { 00:14:50.109 "read": true, 00:14:50.109 "write": true, 00:14:50.109 "unmap": true, 00:14:50.109 "write_zeroes": true, 00:14:50.109 "flush": true, 00:14:50.109 "reset": true, 00:14:50.109 "compare": false, 00:14:50.109 "compare_and_write": false, 00:14:50.109 "abort": true, 00:14:50.109 "nvme_admin": false, 00:14:50.109 "nvme_io": false 00:14:50.109 }, 00:14:50.109 "memory_domains": [ 00:14:50.109 { 00:14:50.109 "dma_device_id": "system", 00:14:50.109 "dma_device_type": 1 00:14:50.109 }, 00:14:50.109 { 00:14:50.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.110 "dma_device_type": 2 00:14:50.110 } 00:14:50.110 ], 00:14:50.110 "driver_specific": {} 00:14:50.110 }' 00:14:50.110 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:50.367 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:50.626 23:56:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:50.626 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:50.626 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:50.626 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:50.626 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:50.884 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:50.884 "name": "BaseBdev3", 00:14:50.884 "aliases": [ 00:14:50.884 
"3bd410a0-d103-480e-94d5-3243d801648d" 00:14:50.884 ], 00:14:50.884 "product_name": "Malloc disk", 00:14:50.884 "block_size": 512, 00:14:50.884 "num_blocks": 65536, 00:14:50.884 "uuid": "3bd410a0-d103-480e-94d5-3243d801648d", 00:14:50.884 "assigned_rate_limits": { 00:14:50.884 "rw_ios_per_sec": 0, 00:14:50.884 "rw_mbytes_per_sec": 0, 00:14:50.884 "r_mbytes_per_sec": 0, 00:14:50.884 "w_mbytes_per_sec": 0 00:14:50.884 }, 00:14:50.884 "claimed": true, 00:14:50.884 "claim_type": "exclusive_write", 00:14:50.884 "zoned": false, 00:14:50.884 "supported_io_types": { 00:14:50.884 "read": true, 00:14:50.885 "write": true, 00:14:50.885 "unmap": true, 00:14:50.885 "write_zeroes": true, 00:14:50.885 "flush": true, 00:14:50.885 "reset": true, 00:14:50.885 "compare": false, 00:14:50.885 "compare_and_write": false, 00:14:50.885 "abort": true, 00:14:50.885 "nvme_admin": false, 00:14:50.885 "nvme_io": false 00:14:50.885 }, 00:14:50.885 "memory_domains": [ 00:14:50.885 { 00:14:50.885 "dma_device_id": "system", 00:14:50.885 "dma_device_type": 1 00:14:50.885 }, 00:14:50.885 { 00:14:50.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.885 "dma_device_type": 2 00:14:50.885 } 00:14:50.885 ], 00:14:50.885 "driver_specific": {} 00:14:50.885 }' 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:50.885 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:51.143 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.143 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:51.143 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:51.143 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:51.143 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:51.401 [2024-05-14 23:56:51.812571] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:51.401 [2024-05-14 23:56:51.812595] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:51.401 [2024-05-14 23:56:51.812648] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:51.401 [2024-05-14 23:56:51.812913] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:51.401 [2024-05-14 23:56:51.812926] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e01d0 name Existed_Raid, state offline 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 421896 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 421896 ']' 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 421896 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 421896 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 421896' 00:14:51.401 killing process with pid 421896 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 421896 00:14:51.401 [2024-05-14 23:56:51.884424] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:51.401 23:56:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 421896 00:14:51.401 [2024-05-14 23:56:51.911706] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:51.662 23:56:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:14:51.662 00:14:51.662 real 0m27.924s 00:14:51.662 user 0m51.215s 00:14:51.662 sys 0m5.028s 00:14:51.662 23:56:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:51.662 23:56:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.662 ************************************ 00:14:51.662 END TEST raid_state_function_test 00:14:51.662 ************************************ 00:14:51.663 23:56:52 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:14:51.663 23:56:52 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:14:51.663 23:56:52 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:51.663 23:56:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:51.663 ************************************ 00:14:51.663 START TEST raid_state_function_test_sb 00:14:51.663 ************************************ 00:14:51.663 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 true 00:14:51.663 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:51.922 23:56:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:14:51.922 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=426028 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 426028' 00:14:51.923 Process raid pid: 426028 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 426028 /var/tmp/spdk-raid.sock 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 426028 ']' 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:51.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:51.923 23:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:51.923 [2024-05-14 23:56:52.319019] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:14:51.923 [2024-05-14 23:56:52.319082] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:51.923 [2024-05-14 23:56:52.449365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:52.181 [2024-05-14 23:56:52.552828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:52.181 [2024-05-14 23:56:52.616549] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:52.181 [2024-05-14 23:56:52.616581] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:52.747 23:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:52.747 23:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:14:52.747 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:53.005 [2024-05-14 23:56:53.457800] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:53.005 [2024-05-14 23:56:53.457840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:53.005 [2024-05-14 23:56:53.457852] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:53.005 [2024-05-14 23:56:53.457864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:53.005 [2024-05-14 23:56:53.457873] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:53.005 [2024-05-14 23:56:53.457884] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.005 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.264 23:56:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:53.264 "name": "Existed_Raid", 00:14:53.264 "uuid": "d13a1443-8927-49f4-bd10-b1b5a1210c84", 00:14:53.264 "strip_size_kb": 0, 00:14:53.264 "state": "configuring", 00:14:53.264 "raid_level": "raid1", 00:14:53.264 "superblock": true, 00:14:53.264 "num_base_bdevs": 3, 00:14:53.264 "num_base_bdevs_discovered": 0, 00:14:53.264 "num_base_bdevs_operational": 3, 00:14:53.264 "base_bdevs_list": [ 00:14:53.264 { 00:14:53.264 "name": "BaseBdev1", 00:14:53.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.264 "is_configured": false, 00:14:53.264 "data_offset": 0, 00:14:53.264 "data_size": 0 00:14:53.264 }, 00:14:53.264 { 00:14:53.264 "name": "BaseBdev2", 00:14:53.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.264 "is_configured": false, 00:14:53.264 "data_offset": 0, 00:14:53.264 "data_size": 0 00:14:53.264 }, 00:14:53.264 { 00:14:53.264 "name": "BaseBdev3", 00:14:53.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.264 "is_configured": false, 00:14:53.264 "data_offset": 0, 00:14:53.264 "data_size": 0 00:14:53.264 } 00:14:53.264 ] 00:14:53.264 }' 00:14:53.264 23:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:53.264 23:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:53.830 23:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:54.087 [2024-05-14 23:56:54.544525] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:54.087 [2024-05-14 23:56:54.544556] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1323be0 name Existed_Raid, state configuring 00:14:54.087 23:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:54.346 [2024-05-14 23:56:54.781161] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:54.346 [2024-05-14 23:56:54.781190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:54.346 [2024-05-14 23:56:54.781201] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:54.346 [2024-05-14 23:56:54.781212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:54.346 [2024-05-14 23:56:54.781221] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:54.346 [2024-05-14 23:56:54.781232] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:54.346 23:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:54.604 [2024-05-14 23:56:55.035774] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:54.604 BaseBdev1 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:54.604 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:54.862 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:55.119 [ 00:14:55.119 { 00:14:55.119 "name": "BaseBdev1", 00:14:55.119 "aliases": [ 00:14:55.119 "4060f92f-c680-4134-ad37-4b73ddde3106" 00:14:55.119 ], 00:14:55.119 "product_name": "Malloc disk", 00:14:55.119 "block_size": 512, 00:14:55.119 "num_blocks": 65536, 00:14:55.119 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:14:55.120 "assigned_rate_limits": { 00:14:55.120 "rw_ios_per_sec": 0, 00:14:55.120 "rw_mbytes_per_sec": 0, 00:14:55.120 "r_mbytes_per_sec": 0, 00:14:55.120 "w_mbytes_per_sec": 0 00:14:55.120 }, 00:14:55.120 "claimed": true, 00:14:55.120 "claim_type": "exclusive_write", 00:14:55.120 "zoned": false, 00:14:55.120 "supported_io_types": { 00:14:55.120 "read": true, 00:14:55.120 "write": true, 00:14:55.120 "unmap": true, 00:14:55.120 "write_zeroes": true, 00:14:55.120 "flush": true, 00:14:55.120 "reset": true, 00:14:55.120 "compare": false, 00:14:55.120 "compare_and_write": false, 00:14:55.120 "abort": true, 00:14:55.120 "nvme_admin": false, 00:14:55.120 "nvme_io": false 00:14:55.120 }, 00:14:55.120 "memory_domains": [ 00:14:55.120 { 00:14:55.120 "dma_device_id": "system", 00:14:55.120 "dma_device_type": 1 00:14:55.120 }, 00:14:55.120 { 00:14:55.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.120 "dma_device_type": 2 00:14:55.120 } 00:14:55.120 ], 00:14:55.120 "driver_specific": {} 00:14:55.120 } 00:14:55.120 ] 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.120 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.377 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:55.377 "name": "Existed_Raid", 00:14:55.377 "uuid": "4994e405-a40b-44ed-8b2e-181a34de9b97", 00:14:55.377 "strip_size_kb": 0, 00:14:55.377 "state": "configuring", 00:14:55.377 "raid_level": "raid1", 00:14:55.377 "superblock": true, 00:14:55.377 "num_base_bdevs": 3, 00:14:55.377 "num_base_bdevs_discovered": 1, 00:14:55.377 "num_base_bdevs_operational": 3, 00:14:55.377 "base_bdevs_list": [ 00:14:55.377 { 00:14:55.377 "name": "BaseBdev1", 00:14:55.377 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:14:55.377 "is_configured": true, 00:14:55.377 "data_offset": 2048, 00:14:55.377 "data_size": 63488 00:14:55.377 }, 00:14:55.377 { 00:14:55.377 "name": "BaseBdev2", 00:14:55.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.377 "is_configured": false, 00:14:55.377 "data_offset": 0, 00:14:55.377 "data_size": 0 00:14:55.377 }, 00:14:55.377 { 00:14:55.377 "name": "BaseBdev3", 00:14:55.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.377 "is_configured": false, 00:14:55.378 "data_offset": 0, 00:14:55.378 "data_size": 0 00:14:55.378 } 00:14:55.378 ] 00:14:55.378 }' 00:14:55.378 23:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:55.378 23:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:55.944 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:56.201 [2024-05-14 23:56:56.615961] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:56.201 [2024-05-14 23:56:56.616001] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13234b0 name Existed_Raid, state configuring 00:14:56.201 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:56.458 [2024-05-14 23:56:56.856635] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:56.458 [2024-05-14 23:56:56.858110] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:56.458 [2024-05-14 23:56:56.858141] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:56.458 [2024-05-14 23:56:56.858151] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:56.458 [2024-05-14 23:56:56.858163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.458 23:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.717 23:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:56.717 "name": "Existed_Raid", 00:14:56.717 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:14:56.717 "strip_size_kb": 0, 00:14:56.717 "state": "configuring", 00:14:56.717 "raid_level": "raid1", 00:14:56.717 "superblock": true, 00:14:56.717 "num_base_bdevs": 3, 00:14:56.717 "num_base_bdevs_discovered": 1, 00:14:56.717 "num_base_bdevs_operational": 3, 00:14:56.717 "base_bdevs_list": [ 00:14:56.717 { 00:14:56.717 "name": "BaseBdev1", 00:14:56.717 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:14:56.717 "is_configured": true, 00:14:56.717 "data_offset": 2048, 00:14:56.717 "data_size": 63488 00:14:56.717 }, 00:14:56.717 { 00:14:56.717 "name": "BaseBdev2", 00:14:56.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.717 "is_configured": false, 00:14:56.717 "data_offset": 0, 00:14:56.717 "data_size": 0 00:14:56.717 }, 00:14:56.717 { 00:14:56.717 "name": "BaseBdev3", 00:14:56.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.717 "is_configured": false, 00:14:56.717 "data_offset": 0, 00:14:56.717 "data_size": 0 00:14:56.717 } 00:14:56.717 ] 00:14:56.717 }' 00:14:56.717 23:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:56.717 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:57.284 23:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:57.542 [2024-05-14 23:56:57.927004] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:57.542 BaseBdev2 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:57.542 23:56:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:57.542 23:56:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:57.801 23:56:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:58.060 [ 00:14:58.060 { 00:14:58.060 "name": "BaseBdev2", 00:14:58.060 "aliases": [ 00:14:58.060 "594e339a-24a9-4c2b-a65d-a1f53f76258e" 00:14:58.060 ], 00:14:58.060 "product_name": "Malloc disk", 00:14:58.060 "block_size": 512, 00:14:58.060 "num_blocks": 65536, 00:14:58.060 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:14:58.060 "assigned_rate_limits": { 00:14:58.060 "rw_ios_per_sec": 0, 00:14:58.060 "rw_mbytes_per_sec": 0, 00:14:58.060 "r_mbytes_per_sec": 0, 00:14:58.060 "w_mbytes_per_sec": 0 00:14:58.060 }, 00:14:58.060 "claimed": true, 00:14:58.060 "claim_type": "exclusive_write", 00:14:58.060 "zoned": false, 00:14:58.060 "supported_io_types": { 00:14:58.060 "read": true, 00:14:58.060 "write": true, 00:14:58.060 "unmap": true, 00:14:58.060 "write_zeroes": true, 00:14:58.060 "flush": true, 00:14:58.060 "reset": true, 00:14:58.060 "compare": false, 00:14:58.060 "compare_and_write": false, 00:14:58.060 "abort": true, 00:14:58.060 "nvme_admin": false, 00:14:58.060 "nvme_io": false 00:14:58.060 }, 00:14:58.060 "memory_domains": [ 00:14:58.060 { 00:14:58.060 "dma_device_id": "system", 00:14:58.060 "dma_device_type": 1 00:14:58.060 }, 00:14:58.060 { 00:14:58.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.060 "dma_device_type": 2 00:14:58.060 } 00:14:58.060 ], 00:14:58.060 "driver_specific": {} 00:14:58.060 } 00:14:58.060 ] 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:58.060 23:56:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.060 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.319 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:58.319 "name": "Existed_Raid", 00:14:58.319 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:14:58.319 "strip_size_kb": 0, 00:14:58.319 "state": "configuring", 00:14:58.319 "raid_level": "raid1", 00:14:58.319 "superblock": true, 00:14:58.319 "num_base_bdevs": 3, 00:14:58.319 "num_base_bdevs_discovered": 2, 00:14:58.319 "num_base_bdevs_operational": 3, 00:14:58.319 "base_bdevs_list": [ 00:14:58.319 { 00:14:58.319 "name": "BaseBdev1", 00:14:58.319 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:14:58.319 "is_configured": true, 00:14:58.319 "data_offset": 2048, 00:14:58.319 "data_size": 63488 00:14:58.319 }, 00:14:58.319 { 00:14:58.319 "name": "BaseBdev2", 00:14:58.319 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:14:58.319 "is_configured": true, 00:14:58.319 "data_offset": 2048, 00:14:58.319 "data_size": 63488 00:14:58.320 }, 00:14:58.320 { 00:14:58.320 "name": "BaseBdev3", 00:14:58.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.320 "is_configured": false, 00:14:58.320 "data_offset": 0, 00:14:58.320 "data_size": 0 00:14:58.320 } 00:14:58.320 ] 00:14:58.320 }' 00:14:58.320 23:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:58.320 23:56:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:58.886 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:59.144 [2024-05-14 23:56:59.486830] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:59.144 [2024-05-14 23:56:59.486990] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1324560 00:14:59.144 [2024-05-14 23:56:59.487005] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:59.144 [2024-05-14 23:56:59.487176] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x133b490 00:14:59.144 [2024-05-14 23:56:59.487293] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1324560 00:14:59.144 [2024-05-14 23:56:59.487303] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1324560 00:14:59.144 [2024-05-14 23:56:59.487413] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:59.144 BaseBdev3 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 
00:14:59.144 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.402 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:59.403 [ 00:14:59.403 { 00:14:59.403 "name": "BaseBdev3", 00:14:59.403 "aliases": [ 00:14:59.403 "4d0c748d-2cc5-4af9-8701-3d66e01718df" 00:14:59.403 ], 00:14:59.403 "product_name": "Malloc disk", 00:14:59.403 "block_size": 512, 00:14:59.403 "num_blocks": 65536, 00:14:59.403 "uuid": "4d0c748d-2cc5-4af9-8701-3d66e01718df", 00:14:59.403 "assigned_rate_limits": { 00:14:59.403 "rw_ios_per_sec": 0, 00:14:59.403 "rw_mbytes_per_sec": 0, 00:14:59.403 "r_mbytes_per_sec": 0, 00:14:59.403 "w_mbytes_per_sec": 0 00:14:59.403 }, 00:14:59.403 "claimed": true, 00:14:59.403 "claim_type": "exclusive_write", 00:14:59.403 "zoned": false, 00:14:59.403 "supported_io_types": { 00:14:59.403 "read": true, 00:14:59.403 "write": true, 00:14:59.403 "unmap": true, 00:14:59.403 "write_zeroes": true, 00:14:59.403 "flush": true, 00:14:59.403 "reset": true, 00:14:59.403 "compare": false, 00:14:59.403 "compare_and_write": false, 00:14:59.403 "abort": true, 00:14:59.403 "nvme_admin": false, 00:14:59.403 "nvme_io": false 00:14:59.403 }, 00:14:59.403 "memory_domains": [ 00:14:59.403 { 00:14:59.403 "dma_device_id": "system", 00:14:59.403 "dma_device_type": 1 00:14:59.403 }, 00:14:59.403 { 00:14:59.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.403 "dma_device_type": 2 00:14:59.403 } 00:14:59.403 ], 00:14:59.403 "driver_specific": {} 00:14:59.403 } 00:14:59.403 ] 00:14:59.403 23:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:59.403 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:59.403 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:59.403 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:59.660 23:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.660 23:56:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.660 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:59.660 "name": "Existed_Raid", 00:14:59.660 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:14:59.660 "strip_size_kb": 0, 00:14:59.660 "state": "online", 00:14:59.660 "raid_level": "raid1", 00:14:59.660 "superblock": true, 00:14:59.660 "num_base_bdevs": 3, 00:14:59.660 "num_base_bdevs_discovered": 3, 00:14:59.660 "num_base_bdevs_operational": 3, 00:14:59.661 "base_bdevs_list": [ 00:14:59.661 { 00:14:59.661 "name": "BaseBdev1", 00:14:59.661 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:14:59.661 "is_configured": true, 00:14:59.661 "data_offset": 2048, 00:14:59.661 "data_size": 63488 00:14:59.661 }, 00:14:59.661 { 00:14:59.661 "name": "BaseBdev2", 00:14:59.661 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:14:59.661 "is_configured": true, 00:14:59.661 "data_offset": 2048, 00:14:59.661 "data_size": 63488 00:14:59.661 }, 00:14:59.661 { 00:14:59.661 "name": "BaseBdev3", 00:14:59.661 "uuid": "4d0c748d-2cc5-4af9-8701-3d66e01718df", 00:14:59.661 "is_configured": true, 00:14:59.661 "data_offset": 2048, 00:14:59.661 "data_size": 63488 00:14:59.661 } 00:14:59.661 ] 00:14:59.661 }' 00:14:59.661 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:59.661 23:57:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:00.593 23:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:00.593 [2024-05-14 23:57:01.063297] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:00.593 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:00.593 "name": "Existed_Raid", 00:15:00.593 "aliases": [ 00:15:00.593 "0b49f1c0-1d65-47be-839f-3f100d3bc9cb" 00:15:00.593 ], 00:15:00.593 "product_name": "Raid Volume", 00:15:00.593 "block_size": 512, 00:15:00.593 "num_blocks": 63488, 00:15:00.593 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:15:00.593 "assigned_rate_limits": { 00:15:00.593 "rw_ios_per_sec": 0, 00:15:00.593 "rw_mbytes_per_sec": 0, 00:15:00.593 "r_mbytes_per_sec": 0, 00:15:00.593 "w_mbytes_per_sec": 0 00:15:00.593 }, 00:15:00.593 "claimed": false, 00:15:00.593 "zoned": false, 00:15:00.593 "supported_io_types": { 00:15:00.593 "read": true, 00:15:00.593 "write": true, 00:15:00.593 "unmap": false, 00:15:00.593 "write_zeroes": true, 00:15:00.593 "flush": false, 00:15:00.593 "reset": true, 00:15:00.593 
"compare": false, 00:15:00.593 "compare_and_write": false, 00:15:00.593 "abort": false, 00:15:00.593 "nvme_admin": false, 00:15:00.593 "nvme_io": false 00:15:00.593 }, 00:15:00.593 "memory_domains": [ 00:15:00.593 { 00:15:00.593 "dma_device_id": "system", 00:15:00.593 "dma_device_type": 1 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.593 "dma_device_type": 2 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "dma_device_id": "system", 00:15:00.593 "dma_device_type": 1 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.593 "dma_device_type": 2 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "dma_device_id": "system", 00:15:00.593 "dma_device_type": 1 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.593 "dma_device_type": 2 00:15:00.593 } 00:15:00.593 ], 00:15:00.593 "driver_specific": { 00:15:00.593 "raid": { 00:15:00.593 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:15:00.593 "strip_size_kb": 0, 00:15:00.593 "state": "online", 00:15:00.593 "raid_level": "raid1", 00:15:00.593 "superblock": true, 00:15:00.593 "num_base_bdevs": 3, 00:15:00.593 "num_base_bdevs_discovered": 3, 00:15:00.593 "num_base_bdevs_operational": 3, 00:15:00.593 "base_bdevs_list": [ 00:15:00.593 { 00:15:00.593 "name": "BaseBdev1", 00:15:00.593 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:15:00.593 "is_configured": true, 00:15:00.593 "data_offset": 2048, 00:15:00.593 "data_size": 63488 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "name": "BaseBdev2", 00:15:00.593 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:15:00.593 "is_configured": true, 00:15:00.593 "data_offset": 2048, 00:15:00.593 "data_size": 63488 00:15:00.593 }, 00:15:00.593 { 00:15:00.593 "name": "BaseBdev3", 00:15:00.593 "uuid": "4d0c748d-2cc5-4af9-8701-3d66e01718df", 00:15:00.593 "is_configured": true, 00:15:00.593 "data_offset": 2048, 00:15:00.593 "data_size": 63488 00:15:00.593 } 00:15:00.593 ] 00:15:00.593 } 00:15:00.593 } 00:15:00.593 }' 00:15:00.593 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:00.593 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:00.593 BaseBdev2 00:15:00.593 BaseBdev3' 00:15:00.593 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:00.594 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:00.594 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:00.851 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:00.851 "name": "BaseBdev1", 00:15:00.851 "aliases": [ 00:15:00.851 "4060f92f-c680-4134-ad37-4b73ddde3106" 00:15:00.851 ], 00:15:00.851 "product_name": "Malloc disk", 00:15:00.851 "block_size": 512, 00:15:00.851 "num_blocks": 65536, 00:15:00.851 "uuid": "4060f92f-c680-4134-ad37-4b73ddde3106", 00:15:00.851 "assigned_rate_limits": { 00:15:00.851 "rw_ios_per_sec": 0, 00:15:00.851 "rw_mbytes_per_sec": 0, 00:15:00.851 "r_mbytes_per_sec": 0, 00:15:00.851 "w_mbytes_per_sec": 0 00:15:00.851 }, 00:15:00.851 "claimed": true, 00:15:00.851 "claim_type": "exclusive_write", 00:15:00.851 "zoned": false, 00:15:00.851 
"supported_io_types": { 00:15:00.851 "read": true, 00:15:00.851 "write": true, 00:15:00.851 "unmap": true, 00:15:00.851 "write_zeroes": true, 00:15:00.851 "flush": true, 00:15:00.851 "reset": true, 00:15:00.851 "compare": false, 00:15:00.851 "compare_and_write": false, 00:15:00.851 "abort": true, 00:15:00.851 "nvme_admin": false, 00:15:00.851 "nvme_io": false 00:15:00.851 }, 00:15:00.851 "memory_domains": [ 00:15:00.851 { 00:15:00.851 "dma_device_id": "system", 00:15:00.851 "dma_device_type": 1 00:15:00.851 }, 00:15:00.851 { 00:15:00.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.851 "dma_device_type": 2 00:15:00.851 } 00:15:00.851 ], 00:15:00.851 "driver_specific": {} 00:15:00.851 }' 00:15:00.851 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.851 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.109 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.369 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:01.369 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:01.369 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:01.369 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:01.640 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:01.640 "name": "BaseBdev2", 00:15:01.640 "aliases": [ 00:15:01.641 "594e339a-24a9-4c2b-a65d-a1f53f76258e" 00:15:01.641 ], 00:15:01.641 "product_name": "Malloc disk", 00:15:01.641 "block_size": 512, 00:15:01.641 "num_blocks": 65536, 00:15:01.641 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:15:01.641 "assigned_rate_limits": { 00:15:01.641 "rw_ios_per_sec": 0, 00:15:01.641 "rw_mbytes_per_sec": 0, 00:15:01.641 "r_mbytes_per_sec": 0, 00:15:01.641 "w_mbytes_per_sec": 0 00:15:01.641 }, 00:15:01.641 "claimed": true, 00:15:01.641 "claim_type": "exclusive_write", 00:15:01.641 "zoned": false, 00:15:01.641 "supported_io_types": { 00:15:01.641 "read": true, 00:15:01.641 "write": true, 00:15:01.641 "unmap": true, 00:15:01.641 "write_zeroes": true, 00:15:01.641 "flush": true, 00:15:01.641 "reset": true, 00:15:01.641 "compare": false, 00:15:01.641 "compare_and_write": false, 00:15:01.641 "abort": true, 00:15:01.641 "nvme_admin": false, 00:15:01.641 "nvme_io": false 00:15:01.641 }, 00:15:01.641 "memory_domains": [ 00:15:01.641 { 
00:15:01.641 "dma_device_id": "system", 00:15:01.641 "dma_device_type": 1 00:15:01.641 }, 00:15:01.641 { 00:15:01.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.641 "dma_device_type": 2 00:15:01.641 } 00:15:01.641 ], 00:15:01.641 "driver_specific": {} 00:15:01.641 }' 00:15:01.641 23:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.641 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.898 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.899 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.899 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:01.899 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:01.899 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:01.899 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:02.155 "name": "BaseBdev3", 00:15:02.155 "aliases": [ 00:15:02.155 "4d0c748d-2cc5-4af9-8701-3d66e01718df" 00:15:02.155 ], 00:15:02.155 "product_name": "Malloc disk", 00:15:02.155 "block_size": 512, 00:15:02.155 "num_blocks": 65536, 00:15:02.155 "uuid": "4d0c748d-2cc5-4af9-8701-3d66e01718df", 00:15:02.155 "assigned_rate_limits": { 00:15:02.155 "rw_ios_per_sec": 0, 00:15:02.155 "rw_mbytes_per_sec": 0, 00:15:02.155 "r_mbytes_per_sec": 0, 00:15:02.155 "w_mbytes_per_sec": 0 00:15:02.155 }, 00:15:02.155 "claimed": true, 00:15:02.155 "claim_type": "exclusive_write", 00:15:02.155 "zoned": false, 00:15:02.155 "supported_io_types": { 00:15:02.155 "read": true, 00:15:02.155 "write": true, 00:15:02.155 "unmap": true, 00:15:02.155 "write_zeroes": true, 00:15:02.155 "flush": true, 00:15:02.155 "reset": true, 00:15:02.155 "compare": false, 00:15:02.155 "compare_and_write": false, 00:15:02.155 "abort": true, 00:15:02.155 "nvme_admin": false, 00:15:02.155 "nvme_io": false 00:15:02.155 }, 00:15:02.155 "memory_domains": [ 00:15:02.155 { 00:15:02.155 "dma_device_id": "system", 00:15:02.155 "dma_device_type": 1 00:15:02.155 }, 00:15:02.155 { 00:15:02.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.155 "dma_device_type": 2 00:15:02.155 } 00:15:02.155 ], 00:15:02.155 "driver_specific": {} 00:15:02.155 }' 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:02.155 23:57:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:02.155 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:02.413 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.413 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:02.413 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:02.413 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:02.413 23:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:02.671 [2024-05-14 23:57:03.048344] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.671 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:02.929 23:57:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:02.929 "name": "Existed_Raid", 00:15:02.929 "uuid": "0b49f1c0-1d65-47be-839f-3f100d3bc9cb", 00:15:02.929 "strip_size_kb": 0, 00:15:02.929 "state": "online", 00:15:02.929 "raid_level": "raid1", 00:15:02.929 "superblock": true, 00:15:02.929 "num_base_bdevs": 3, 00:15:02.929 "num_base_bdevs_discovered": 2, 00:15:02.929 "num_base_bdevs_operational": 2, 00:15:02.929 "base_bdevs_list": [ 00:15:02.929 { 00:15:02.929 "name": null, 00:15:02.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.929 "is_configured": false, 00:15:02.929 "data_offset": 2048, 00:15:02.929 "data_size": 63488 00:15:02.929 }, 00:15:02.929 { 00:15:02.929 "name": "BaseBdev2", 00:15:02.929 "uuid": "594e339a-24a9-4c2b-a65d-a1f53f76258e", 00:15:02.929 "is_configured": true, 00:15:02.929 "data_offset": 2048, 00:15:02.929 "data_size": 63488 00:15:02.929 }, 00:15:02.929 { 00:15:02.929 "name": "BaseBdev3", 00:15:02.929 "uuid": "4d0c748d-2cc5-4af9-8701-3d66e01718df", 00:15:02.929 "is_configured": true, 00:15:02.929 "data_offset": 2048, 00:15:02.929 "data_size": 63488 00:15:02.929 } 00:15:02.929 ] 00:15:02.929 }' 00:15:02.929 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:02.929 23:57:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:03.495 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:03.495 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:03.495 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.495 23:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:03.753 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:03.753 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:03.753 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:04.009 [2024-05-14 23:57:04.373014] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:04.009 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:04.009 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:04.010 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.010 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:04.267 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:04.267 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:04.267 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:04.524 [2024-05-14 23:57:04.874452] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:15:04.524 [2024-05-14 23:57:04.874528] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:04.524 [2024-05-14 23:57:04.887230] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:04.524 [2024-05-14 23:57:04.887296] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:04.524 [2024-05-14 23:57:04.887310] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1324560 name Existed_Raid, state offline 00:15:04.524 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:04.524 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:04.524 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:04.524 23:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:04.782 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:05.040 BaseBdev2 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.040 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:05.299 [ 00:15:05.299 { 00:15:05.299 "name": "BaseBdev2", 00:15:05.299 "aliases": [ 00:15:05.299 "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8" 00:15:05.299 ], 00:15:05.299 "product_name": "Malloc disk", 00:15:05.299 "block_size": 512, 00:15:05.299 "num_blocks": 65536, 00:15:05.299 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:05.299 "assigned_rate_limits": { 00:15:05.299 "rw_ios_per_sec": 0, 00:15:05.299 "rw_mbytes_per_sec": 0, 00:15:05.299 "r_mbytes_per_sec": 0, 00:15:05.299 "w_mbytes_per_sec": 0 00:15:05.299 }, 00:15:05.299 
"claimed": false, 00:15:05.299 "zoned": false, 00:15:05.299 "supported_io_types": { 00:15:05.299 "read": true, 00:15:05.299 "write": true, 00:15:05.299 "unmap": true, 00:15:05.299 "write_zeroes": true, 00:15:05.299 "flush": true, 00:15:05.299 "reset": true, 00:15:05.299 "compare": false, 00:15:05.299 "compare_and_write": false, 00:15:05.299 "abort": true, 00:15:05.299 "nvme_admin": false, 00:15:05.299 "nvme_io": false 00:15:05.299 }, 00:15:05.299 "memory_domains": [ 00:15:05.299 { 00:15:05.299 "dma_device_id": "system", 00:15:05.299 "dma_device_type": 1 00:15:05.299 }, 00:15:05.299 { 00:15:05.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.299 "dma_device_type": 2 00:15:05.299 } 00:15:05.299 ], 00:15:05.299 "driver_specific": {} 00:15:05.299 } 00:15:05.299 ] 00:15:05.299 23:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:05.299 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:05.299 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:05.299 23:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:05.557 BaseBdev3 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:05.557 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.814 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:06.072 [ 00:15:06.072 { 00:15:06.072 "name": "BaseBdev3", 00:15:06.072 "aliases": [ 00:15:06.072 "15c1790d-e20b-422c-86a2-72fac2137a51" 00:15:06.072 ], 00:15:06.072 "product_name": "Malloc disk", 00:15:06.072 "block_size": 512, 00:15:06.072 "num_blocks": 65536, 00:15:06.072 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:06.072 "assigned_rate_limits": { 00:15:06.072 "rw_ios_per_sec": 0, 00:15:06.072 "rw_mbytes_per_sec": 0, 00:15:06.072 "r_mbytes_per_sec": 0, 00:15:06.072 "w_mbytes_per_sec": 0 00:15:06.072 }, 00:15:06.072 "claimed": false, 00:15:06.072 "zoned": false, 00:15:06.072 "supported_io_types": { 00:15:06.072 "read": true, 00:15:06.072 "write": true, 00:15:06.072 "unmap": true, 00:15:06.072 "write_zeroes": true, 00:15:06.072 "flush": true, 00:15:06.072 "reset": true, 00:15:06.072 "compare": false, 00:15:06.072 "compare_and_write": false, 00:15:06.072 "abort": true, 00:15:06.072 "nvme_admin": false, 00:15:06.072 "nvme_io": false 00:15:06.072 }, 00:15:06.072 "memory_domains": [ 00:15:06.072 { 00:15:06.072 "dma_device_id": "system", 00:15:06.072 
"dma_device_type": 1 00:15:06.072 }, 00:15:06.072 { 00:15:06.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.072 "dma_device_type": 2 00:15:06.072 } 00:15:06.072 ], 00:15:06.072 "driver_specific": {} 00:15:06.072 } 00:15:06.072 ] 00:15:06.072 23:57:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:06.072 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:06.072 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:06.072 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:06.330 [2024-05-14 23:57:06.796811] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:06.330 [2024-05-14 23:57:06.796852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:06.330 [2024-05-14 23:57:06.796872] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:06.330 [2024-05-14 23:57:06.798260] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.330 23:57:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.588 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:06.588 "name": "Existed_Raid", 00:15:06.588 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:06.588 "strip_size_kb": 0, 00:15:06.588 "state": "configuring", 00:15:06.588 "raid_level": "raid1", 00:15:06.588 "superblock": true, 00:15:06.588 "num_base_bdevs": 3, 00:15:06.588 "num_base_bdevs_discovered": 2, 00:15:06.588 "num_base_bdevs_operational": 3, 00:15:06.588 "base_bdevs_list": [ 00:15:06.588 { 00:15:06.588 "name": "BaseBdev1", 00:15:06.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.588 "is_configured": false, 00:15:06.588 "data_offset": 0, 00:15:06.588 
"data_size": 0 00:15:06.588 }, 00:15:06.588 { 00:15:06.588 "name": "BaseBdev2", 00:15:06.588 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:06.588 "is_configured": true, 00:15:06.588 "data_offset": 2048, 00:15:06.588 "data_size": 63488 00:15:06.588 }, 00:15:06.588 { 00:15:06.588 "name": "BaseBdev3", 00:15:06.588 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:06.588 "is_configured": true, 00:15:06.588 "data_offset": 2048, 00:15:06.588 "data_size": 63488 00:15:06.588 } 00:15:06.588 ] 00:15:06.588 }' 00:15:06.588 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:06.588 23:57:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.154 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:07.412 [2024-05-14 23:57:07.887690] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.412 23:57:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.671 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:07.671 "name": "Existed_Raid", 00:15:07.671 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:07.671 "strip_size_kb": 0, 00:15:07.671 "state": "configuring", 00:15:07.671 "raid_level": "raid1", 00:15:07.671 "superblock": true, 00:15:07.671 "num_base_bdevs": 3, 00:15:07.671 "num_base_bdevs_discovered": 1, 00:15:07.671 "num_base_bdevs_operational": 3, 00:15:07.671 "base_bdevs_list": [ 00:15:07.671 { 00:15:07.671 "name": "BaseBdev1", 00:15:07.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.671 "is_configured": false, 00:15:07.671 "data_offset": 0, 00:15:07.671 "data_size": 0 00:15:07.671 }, 00:15:07.671 { 00:15:07.671 "name": null, 00:15:07.671 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:07.671 "is_configured": false, 00:15:07.671 "data_offset": 2048, 00:15:07.671 "data_size": 63488 00:15:07.671 }, 00:15:07.671 { 00:15:07.671 
"name": "BaseBdev3", 00:15:07.671 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:07.671 "is_configured": true, 00:15:07.671 "data_offset": 2048, 00:15:07.671 "data_size": 63488 00:15:07.671 } 00:15:07.671 ] 00:15:07.671 }' 00:15:07.671 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:07.671 23:57:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.237 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.237 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:08.495 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:15:08.495 23:57:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:08.753 [2024-05-14 23:57:09.130552] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:08.753 BaseBdev1 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:08.753 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:09.010 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:09.268 [ 00:15:09.268 { 00:15:09.268 "name": "BaseBdev1", 00:15:09.268 "aliases": [ 00:15:09.268 "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38" 00:15:09.268 ], 00:15:09.268 "product_name": "Malloc disk", 00:15:09.268 "block_size": 512, 00:15:09.268 "num_blocks": 65536, 00:15:09.268 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:09.268 "assigned_rate_limits": { 00:15:09.268 "rw_ios_per_sec": 0, 00:15:09.268 "rw_mbytes_per_sec": 0, 00:15:09.268 "r_mbytes_per_sec": 0, 00:15:09.268 "w_mbytes_per_sec": 0 00:15:09.268 }, 00:15:09.268 "claimed": true, 00:15:09.268 "claim_type": "exclusive_write", 00:15:09.268 "zoned": false, 00:15:09.268 "supported_io_types": { 00:15:09.268 "read": true, 00:15:09.268 "write": true, 00:15:09.268 "unmap": true, 00:15:09.268 "write_zeroes": true, 00:15:09.268 "flush": true, 00:15:09.268 "reset": true, 00:15:09.268 "compare": false, 00:15:09.268 "compare_and_write": false, 00:15:09.268 "abort": true, 00:15:09.268 "nvme_admin": false, 00:15:09.268 "nvme_io": false 00:15:09.268 }, 00:15:09.268 "memory_domains": [ 00:15:09.268 { 00:15:09.268 "dma_device_id": "system", 00:15:09.268 "dma_device_type": 1 00:15:09.268 }, 00:15:09.268 { 
00:15:09.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.268 "dma_device_type": 2 00:15:09.268 } 00:15:09.268 ], 00:15:09.268 "driver_specific": {} 00:15:09.268 } 00:15:09.268 ] 00:15:09.268 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:09.268 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.269 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.527 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:09.527 "name": "Existed_Raid", 00:15:09.527 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:09.527 "strip_size_kb": 0, 00:15:09.527 "state": "configuring", 00:15:09.527 "raid_level": "raid1", 00:15:09.527 "superblock": true, 00:15:09.527 "num_base_bdevs": 3, 00:15:09.527 "num_base_bdevs_discovered": 2, 00:15:09.527 "num_base_bdevs_operational": 3, 00:15:09.527 "base_bdevs_list": [ 00:15:09.527 { 00:15:09.527 "name": "BaseBdev1", 00:15:09.527 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:09.527 "is_configured": true, 00:15:09.527 "data_offset": 2048, 00:15:09.527 "data_size": 63488 00:15:09.527 }, 00:15:09.527 { 00:15:09.527 "name": null, 00:15:09.527 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:09.527 "is_configured": false, 00:15:09.527 "data_offset": 2048, 00:15:09.527 "data_size": 63488 00:15:09.527 }, 00:15:09.527 { 00:15:09.527 "name": "BaseBdev3", 00:15:09.527 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:09.527 "is_configured": true, 00:15:09.527 "data_offset": 2048, 00:15:09.527 "data_size": 63488 00:15:09.527 } 00:15:09.527 ] 00:15:09.527 }' 00:15:09.527 23:57:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:09.527 23:57:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:10.093 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.093 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:15:10.350 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:15:10.350 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:10.609 [2024-05-14 23:57:10.951421] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.609 23:57:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.609 23:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:10.609 "name": "Existed_Raid", 00:15:10.609 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:10.609 "strip_size_kb": 0, 00:15:10.609 "state": "configuring", 00:15:10.609 "raid_level": "raid1", 00:15:10.609 "superblock": true, 00:15:10.609 "num_base_bdevs": 3, 00:15:10.609 "num_base_bdevs_discovered": 1, 00:15:10.609 "num_base_bdevs_operational": 3, 00:15:10.609 "base_bdevs_list": [ 00:15:10.609 { 00:15:10.609 "name": "BaseBdev1", 00:15:10.609 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:10.609 "is_configured": true, 00:15:10.609 "data_offset": 2048, 00:15:10.609 "data_size": 63488 00:15:10.609 }, 00:15:10.609 { 00:15:10.609 "name": null, 00:15:10.609 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:10.609 "is_configured": false, 00:15:10.609 "data_offset": 2048, 00:15:10.609 "data_size": 63488 00:15:10.609 }, 00:15:10.609 { 00:15:10.609 "name": null, 00:15:10.609 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:10.609 "is_configured": false, 00:15:10.609 "data_offset": 2048, 00:15:10.609 "data_size": 63488 00:15:10.609 } 00:15:10.609 ] 00:15:10.609 }' 00:15:10.609 23:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:10.609 23:57:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.181 23:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 
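Note on the run above: this part of raid_state_function_test_sb removes and re-creates base bdevs one at a time and asserts that Existed_Raid stays in the "configuring" state, with the vacated slot reported as unconfigured, until every slot is populated again. A minimal sketch of that check, using only the RPCs and jq filters echoed in this log (same socket path and bdev names; not the test's exact helper code):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Drop one base bdev, then assert the array falls back to "configuring"
  # and the vacated slot reports is_configured == false.
  $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev3
  state=$($rpc -s $sock bdev_raid_get_bdevs all \
          | jq -r '.[] | select(.name == "Existed_Raid") | .state')
  slot=$($rpc -s $sock bdev_raid_get_bdevs all \
         | jq '.[0].base_bdevs_list[2].is_configured')
  [[ $state == configuring && $slot == false ]] || exit 1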
00:15:11.181 23:57:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.442 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:15:11.442 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:11.700 [2024-05-14 23:57:12.234860] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.700 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.957 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:11.957 "name": "Existed_Raid", 00:15:11.957 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:11.957 "strip_size_kb": 0, 00:15:11.957 "state": "configuring", 00:15:11.957 "raid_level": "raid1", 00:15:11.957 "superblock": true, 00:15:11.957 "num_base_bdevs": 3, 00:15:11.957 "num_base_bdevs_discovered": 2, 00:15:11.957 "num_base_bdevs_operational": 3, 00:15:11.957 "base_bdevs_list": [ 00:15:11.957 { 00:15:11.957 "name": "BaseBdev1", 00:15:11.957 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:11.957 "is_configured": true, 00:15:11.957 "data_offset": 2048, 00:15:11.957 "data_size": 63488 00:15:11.957 }, 00:15:11.957 { 00:15:11.957 "name": null, 00:15:11.957 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:11.957 "is_configured": false, 00:15:11.957 "data_offset": 2048, 00:15:11.957 "data_size": 63488 00:15:11.957 }, 00:15:11.957 { 00:15:11.957 "name": "BaseBdev3", 00:15:11.957 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:11.957 "is_configured": true, 00:15:11.957 "data_offset": 2048, 00:15:11.957 "data_size": 63488 00:15:11.957 } 00:15:11.957 ] 00:15:11.957 }' 00:15:11.957 23:57:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:11.957 23:57:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:12.523 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.523 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:12.782 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:15:12.782 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:13.039 [2024-05-14 23:57:13.566389] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.039 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.297 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:13.297 "name": "Existed_Raid", 00:15:13.297 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:13.297 "strip_size_kb": 0, 00:15:13.297 "state": "configuring", 00:15:13.297 "raid_level": "raid1", 00:15:13.297 "superblock": true, 00:15:13.297 "num_base_bdevs": 3, 00:15:13.297 "num_base_bdevs_discovered": 1, 00:15:13.297 "num_base_bdevs_operational": 3, 00:15:13.297 "base_bdevs_list": [ 00:15:13.297 { 00:15:13.297 "name": null, 00:15:13.297 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:13.297 "is_configured": false, 00:15:13.297 "data_offset": 2048, 00:15:13.297 "data_size": 63488 00:15:13.297 }, 00:15:13.297 { 00:15:13.297 "name": null, 00:15:13.297 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:13.297 "is_configured": false, 00:15:13.297 "data_offset": 2048, 00:15:13.297 "data_size": 63488 00:15:13.297 }, 00:15:13.297 { 00:15:13.297 "name": "BaseBdev3", 00:15:13.297 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:13.297 "is_configured": true, 00:15:13.297 "data_offset": 2048, 00:15:13.297 "data_size": 63488 00:15:13.297 } 
00:15:13.297 ] 00:15:13.297 }' 00:15:13.297 23:57:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:13.297 23:57:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.863 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.863 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:14.120 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:15:14.120 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:14.378 [2024-05-14 23:57:14.916552] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.378 23:57:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.636 23:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:14.636 "name": "Existed_Raid", 00:15:14.636 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:14.636 "strip_size_kb": 0, 00:15:14.636 "state": "configuring", 00:15:14.636 "raid_level": "raid1", 00:15:14.636 "superblock": true, 00:15:14.636 "num_base_bdevs": 3, 00:15:14.636 "num_base_bdevs_discovered": 2, 00:15:14.636 "num_base_bdevs_operational": 3, 00:15:14.636 "base_bdevs_list": [ 00:15:14.636 { 00:15:14.636 "name": null, 00:15:14.636 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:14.636 "is_configured": false, 00:15:14.636 "data_offset": 2048, 00:15:14.636 "data_size": 63488 00:15:14.636 }, 00:15:14.636 { 00:15:14.636 "name": "BaseBdev2", 00:15:14.636 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:14.636 "is_configured": true, 00:15:14.636 "data_offset": 2048, 00:15:14.636 "data_size": 63488 00:15:14.636 }, 00:15:14.636 { 00:15:14.636 
"name": "BaseBdev3", 00:15:14.636 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:14.636 "is_configured": true, 00:15:14.636 "data_offset": 2048, 00:15:14.636 "data_size": 63488 00:15:14.636 } 00:15:14.636 ] 00:15:14.636 }' 00:15:14.636 23:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:14.636 23:57:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.202 23:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.202 23:57:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:15.459 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:15:15.459 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.459 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:15.717 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c1a8d56a-d77d-4219-a1c1-cbef4bef4f38 00:15:15.975 [2024-05-14 23:57:16.489619] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:15.975 [2024-05-14 23:57:16.489773] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1325730 00:15:15.975 [2024-05-14 23:57:16.489786] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:15.975 [2024-05-14 23:57:16.489960] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d2600 00:15:15.975 [2024-05-14 23:57:16.490090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1325730 00:15:15.975 [2024-05-14 23:57:16.490100] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1325730 00:15:15.975 [2024-05-14 23:57:16.490199] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.975 NewBaseBdev 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:15.975 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.233 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:16.490 [ 00:15:16.490 { 
00:15:16.490 "name": "NewBaseBdev", 00:15:16.490 "aliases": [ 00:15:16.490 "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38" 00:15:16.490 ], 00:15:16.490 "product_name": "Malloc disk", 00:15:16.490 "block_size": 512, 00:15:16.490 "num_blocks": 65536, 00:15:16.490 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:16.490 "assigned_rate_limits": { 00:15:16.490 "rw_ios_per_sec": 0, 00:15:16.490 "rw_mbytes_per_sec": 0, 00:15:16.490 "r_mbytes_per_sec": 0, 00:15:16.490 "w_mbytes_per_sec": 0 00:15:16.490 }, 00:15:16.490 "claimed": true, 00:15:16.490 "claim_type": "exclusive_write", 00:15:16.490 "zoned": false, 00:15:16.490 "supported_io_types": { 00:15:16.490 "read": true, 00:15:16.490 "write": true, 00:15:16.490 "unmap": true, 00:15:16.490 "write_zeroes": true, 00:15:16.490 "flush": true, 00:15:16.490 "reset": true, 00:15:16.490 "compare": false, 00:15:16.490 "compare_and_write": false, 00:15:16.490 "abort": true, 00:15:16.490 "nvme_admin": false, 00:15:16.490 "nvme_io": false 00:15:16.490 }, 00:15:16.490 "memory_domains": [ 00:15:16.490 { 00:15:16.490 "dma_device_id": "system", 00:15:16.490 "dma_device_type": 1 00:15:16.490 }, 00:15:16.490 { 00:15:16.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.490 "dma_device_type": 2 00:15:16.490 } 00:15:16.490 ], 00:15:16.490 "driver_specific": {} 00:15:16.490 } 00:15:16.490 ] 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.490 23:57:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.748 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:16.748 "name": "Existed_Raid", 00:15:16.748 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:16.748 "strip_size_kb": 0, 00:15:16.748 "state": "online", 00:15:16.748 "raid_level": "raid1", 00:15:16.748 "superblock": true, 00:15:16.748 "num_base_bdevs": 3, 00:15:16.748 "num_base_bdevs_discovered": 3, 00:15:16.748 "num_base_bdevs_operational": 3, 00:15:16.748 "base_bdevs_list": [ 00:15:16.748 { 00:15:16.748 "name": "NewBaseBdev", 00:15:16.748 "uuid": 
"c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:16.748 "is_configured": true, 00:15:16.748 "data_offset": 2048, 00:15:16.748 "data_size": 63488 00:15:16.748 }, 00:15:16.748 { 00:15:16.748 "name": "BaseBdev2", 00:15:16.748 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:16.748 "is_configured": true, 00:15:16.748 "data_offset": 2048, 00:15:16.748 "data_size": 63488 00:15:16.748 }, 00:15:16.748 { 00:15:16.748 "name": "BaseBdev3", 00:15:16.748 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:16.748 "is_configured": true, 00:15:16.748 "data_offset": 2048, 00:15:16.748 "data_size": 63488 00:15:16.748 } 00:15:16.748 ] 00:15:16.748 }' 00:15:16.748 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:16.748 23:57:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:17.314 23:57:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:17.602 [2024-05-14 23:57:18.062059] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:17.602 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:17.602 "name": "Existed_Raid", 00:15:17.602 "aliases": [ 00:15:17.602 "871aefd7-34e0-45fc-892d-de85e8fce1a9" 00:15:17.602 ], 00:15:17.602 "product_name": "Raid Volume", 00:15:17.602 "block_size": 512, 00:15:17.602 "num_blocks": 63488, 00:15:17.602 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:17.602 "assigned_rate_limits": { 00:15:17.602 "rw_ios_per_sec": 0, 00:15:17.602 "rw_mbytes_per_sec": 0, 00:15:17.602 "r_mbytes_per_sec": 0, 00:15:17.602 "w_mbytes_per_sec": 0 00:15:17.602 }, 00:15:17.602 "claimed": false, 00:15:17.602 "zoned": false, 00:15:17.602 "supported_io_types": { 00:15:17.602 "read": true, 00:15:17.602 "write": true, 00:15:17.602 "unmap": false, 00:15:17.602 "write_zeroes": true, 00:15:17.602 "flush": false, 00:15:17.602 "reset": true, 00:15:17.602 "compare": false, 00:15:17.602 "compare_and_write": false, 00:15:17.602 "abort": false, 00:15:17.602 "nvme_admin": false, 00:15:17.602 "nvme_io": false 00:15:17.602 }, 00:15:17.602 "memory_domains": [ 00:15:17.602 { 00:15:17.602 "dma_device_id": "system", 00:15:17.602 "dma_device_type": 1 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.602 "dma_device_type": 2 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "dma_device_id": "system", 00:15:17.602 "dma_device_type": 1 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.602 "dma_device_type": 2 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 
"dma_device_id": "system", 00:15:17.602 "dma_device_type": 1 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.602 "dma_device_type": 2 00:15:17.602 } 00:15:17.602 ], 00:15:17.602 "driver_specific": { 00:15:17.602 "raid": { 00:15:17.602 "uuid": "871aefd7-34e0-45fc-892d-de85e8fce1a9", 00:15:17.602 "strip_size_kb": 0, 00:15:17.602 "state": "online", 00:15:17.602 "raid_level": "raid1", 00:15:17.602 "superblock": true, 00:15:17.602 "num_base_bdevs": 3, 00:15:17.602 "num_base_bdevs_discovered": 3, 00:15:17.602 "num_base_bdevs_operational": 3, 00:15:17.602 "base_bdevs_list": [ 00:15:17.602 { 00:15:17.602 "name": "NewBaseBdev", 00:15:17.602 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:17.602 "is_configured": true, 00:15:17.602 "data_offset": 2048, 00:15:17.602 "data_size": 63488 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "name": "BaseBdev2", 00:15:17.602 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:17.602 "is_configured": true, 00:15:17.602 "data_offset": 2048, 00:15:17.602 "data_size": 63488 00:15:17.602 }, 00:15:17.602 { 00:15:17.602 "name": "BaseBdev3", 00:15:17.602 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:17.602 "is_configured": true, 00:15:17.602 "data_offset": 2048, 00:15:17.602 "data_size": 63488 00:15:17.602 } 00:15:17.602 ] 00:15:17.602 } 00:15:17.602 } 00:15:17.602 }' 00:15:17.602 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:17.602 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:15:17.602 BaseBdev2 00:15:17.602 BaseBdev3' 00:15:17.602 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:17.602 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:17.603 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:17.861 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:17.861 "name": "NewBaseBdev", 00:15:17.861 "aliases": [ 00:15:17.861 "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38" 00:15:17.861 ], 00:15:17.861 "product_name": "Malloc disk", 00:15:17.861 "block_size": 512, 00:15:17.861 "num_blocks": 65536, 00:15:17.861 "uuid": "c1a8d56a-d77d-4219-a1c1-cbef4bef4f38", 00:15:17.861 "assigned_rate_limits": { 00:15:17.861 "rw_ios_per_sec": 0, 00:15:17.861 "rw_mbytes_per_sec": 0, 00:15:17.861 "r_mbytes_per_sec": 0, 00:15:17.861 "w_mbytes_per_sec": 0 00:15:17.861 }, 00:15:17.861 "claimed": true, 00:15:17.861 "claim_type": "exclusive_write", 00:15:17.861 "zoned": false, 00:15:17.861 "supported_io_types": { 00:15:17.861 "read": true, 00:15:17.861 "write": true, 00:15:17.861 "unmap": true, 00:15:17.861 "write_zeroes": true, 00:15:17.861 "flush": true, 00:15:17.861 "reset": true, 00:15:17.861 "compare": false, 00:15:17.861 "compare_and_write": false, 00:15:17.861 "abort": true, 00:15:17.861 "nvme_admin": false, 00:15:17.861 "nvme_io": false 00:15:17.861 }, 00:15:17.861 "memory_domains": [ 00:15:17.861 { 00:15:17.861 "dma_device_id": "system", 00:15:17.861 "dma_device_type": 1 00:15:17.861 }, 00:15:17.861 { 00:15:17.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.861 "dma_device_type": 2 00:15:17.861 } 00:15:17.861 ], 00:15:17.861 "driver_specific": {} 
00:15:17.861 }' 00:15:17.861 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:17.861 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:18.119 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:18.377 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:18.377 "name": "BaseBdev2", 00:15:18.377 "aliases": [ 00:15:18.377 "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8" 00:15:18.377 ], 00:15:18.377 "product_name": "Malloc disk", 00:15:18.377 "block_size": 512, 00:15:18.377 "num_blocks": 65536, 00:15:18.377 "uuid": "ea136e86-4ce2-413e-b2ea-1f3c1d41bcc8", 00:15:18.377 "assigned_rate_limits": { 00:15:18.377 "rw_ios_per_sec": 0, 00:15:18.377 "rw_mbytes_per_sec": 0, 00:15:18.377 "r_mbytes_per_sec": 0, 00:15:18.377 "w_mbytes_per_sec": 0 00:15:18.377 }, 00:15:18.377 "claimed": true, 00:15:18.377 "claim_type": "exclusive_write", 00:15:18.377 "zoned": false, 00:15:18.377 "supported_io_types": { 00:15:18.377 "read": true, 00:15:18.377 "write": true, 00:15:18.377 "unmap": true, 00:15:18.377 "write_zeroes": true, 00:15:18.377 "flush": true, 00:15:18.377 "reset": true, 00:15:18.377 "compare": false, 00:15:18.377 "compare_and_write": false, 00:15:18.377 "abort": true, 00:15:18.377 "nvme_admin": false, 00:15:18.377 "nvme_io": false 00:15:18.377 }, 00:15:18.377 "memory_domains": [ 00:15:18.377 { 00:15:18.377 "dma_device_id": "system", 00:15:18.377 "dma_device_type": 1 00:15:18.377 }, 00:15:18.377 { 00:15:18.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.377 "dma_device_type": 2 00:15:18.377 } 00:15:18.377 ], 00:15:18.377 "driver_specific": {} 00:15:18.377 }' 00:15:18.377 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:18.377 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:18.635 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:18.635 23:57:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 
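The jq probes running here belong to verify_raid_bdev_properties: it dumps the Raid Volume, walks every configured base bdev, and checks that block_size matches the array while md_size, md_interleave and dif_type stay null. Roughly, under the same socket and names as above (a sketch, not the helper itself):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  raid_bs=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid | jq '.[0].block_size')
  names=$($rpc -s $sock bdev_get_bdevs -b Existed_Raid \
          | jq -r '.[0].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
  for name in $names; do
    base=$($rpc -s $sock bdev_get_bdevs -b "$name")
    [[ $(echo "$base" | jq '.[0].block_size') == "$raid_bs" ]] || exit 1   # 512 for the malloc bases in this run
    [[ $(echo "$base" | jq '.[0].md_size') == null ]] || exit 1            # no separate metadata on these bdevs
  done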
00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:18.635 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:18.892 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:18.892 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:18.892 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:18.892 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:19.150 "name": "BaseBdev3", 00:15:19.150 "aliases": [ 00:15:19.150 "15c1790d-e20b-422c-86a2-72fac2137a51" 00:15:19.150 ], 00:15:19.150 "product_name": "Malloc disk", 00:15:19.150 "block_size": 512, 00:15:19.150 "num_blocks": 65536, 00:15:19.150 "uuid": "15c1790d-e20b-422c-86a2-72fac2137a51", 00:15:19.150 "assigned_rate_limits": { 00:15:19.150 "rw_ios_per_sec": 0, 00:15:19.150 "rw_mbytes_per_sec": 0, 00:15:19.150 "r_mbytes_per_sec": 0, 00:15:19.150 "w_mbytes_per_sec": 0 00:15:19.150 }, 00:15:19.150 "claimed": true, 00:15:19.150 "claim_type": "exclusive_write", 00:15:19.150 "zoned": false, 00:15:19.150 "supported_io_types": { 00:15:19.150 "read": true, 00:15:19.150 "write": true, 00:15:19.150 "unmap": true, 00:15:19.150 "write_zeroes": true, 00:15:19.150 "flush": true, 00:15:19.150 "reset": true, 00:15:19.150 "compare": false, 00:15:19.150 "compare_and_write": false, 00:15:19.150 "abort": true, 00:15:19.150 "nvme_admin": false, 00:15:19.150 "nvme_io": false 00:15:19.150 }, 00:15:19.150 "memory_domains": [ 00:15:19.150 { 00:15:19.150 "dma_device_id": "system", 00:15:19.150 "dma_device_type": 1 00:15:19.150 }, 00:15:19.150 { 00:15:19.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.150 "dma_device_type": 2 00:15:19.150 } 00:15:19.150 ], 00:15:19.150 "driver_specific": {} 00:15:19.150 }' 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:19.150 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:15:19.408 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.408 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:19.408 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:19.408 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:19.408 23:57:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:19.666 [2024-05-14 23:57:20.079276] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:19.666 [2024-05-14 23:57:20.079305] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:19.666 [2024-05-14 23:57:20.079362] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:19.666 [2024-05-14 23:57:20.079637] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:19.666 [2024-05-14 23:57:20.079651] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1325730 name Existed_Raid, state offline 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 426028 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 426028 ']' 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 426028 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 426028 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 426028' 00:15:19.666 killing process with pid 426028 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 426028 00:15:19.666 [2024-05-14 23:57:20.137224] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:19.666 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 426028 00:15:19.666 [2024-05-14 23:57:20.164969] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:19.924 23:57:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:15:19.924 00:15:19.924 real 0m28.157s 00:15:19.924 user 0m51.646s 00:15:19.924 sys 0m5.046s 00:15:19.924 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:19.924 23:57:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.924 ************************************ 00:15:19.924 END TEST raid_state_function_test_sb 00:15:19.924 ************************************ 00:15:19.924 23:57:20 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:15:19.924 
23:57:20 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:15:19.924 23:57:20 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:19.924 23:57:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:19.924 ************************************ 00:15:19.924 START TEST raid_superblock_test 00:15:19.924 ************************************ 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 3 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=430828 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 430828 /var/tmp/spdk-raid.sock 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 430828 ']' 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:19.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:19.924 23:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.183 [2024-05-14 23:57:20.560921] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
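At this point raid_superblock_test has launched a dedicated bdev_svc app on its own RPC socket with bdev_raid debug logging and is waiting for it to answer RPCs before building the pt1/pt2/pt3 passthru bdevs that follow. A rough stand-in for that bring-up (rpc_get_methods is used here only as a generic liveness probe; the script's own waitforlisten helper differs):

  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  $app -r $sock -L bdev_raid &
  raid_pid=$!
  # Poll the socket until the app is ready to serve RPCs.
  until $rpc -s $sock rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
  done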
00:15:20.183 [2024-05-14 23:57:20.560984] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid430828 ] 00:15:20.183 [2024-05-14 23:57:20.688850] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:20.441 [2024-05-14 23:57:20.797927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:20.441 [2024-05-14 23:57:20.866220] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:20.441 [2024-05-14 23:57:20.866260] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.006 23:57:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:21.007 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:21.264 malloc1 00:15:21.264 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:21.523 [2024-05-14 23:57:21.957519] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:21.523 [2024-05-14 23:57:21.957570] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:21.523 [2024-05-14 23:57:21.957593] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1329780 00:15:21.523 [2024-05-14 23:57:21.957605] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:21.523 [2024-05-14 23:57:21.959389] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:21.523 [2024-05-14 23:57:21.959424] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:21.523 pt1 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:21.523 23:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:21.782 malloc2 00:15:21.782 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:22.040 [2024-05-14 23:57:22.456995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:22.040 [2024-05-14 23:57:22.457043] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.040 [2024-05-14 23:57:22.457062] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132ab60 00:15:22.040 [2024-05-14 23:57:22.457074] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.040 [2024-05-14 23:57:22.458651] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.040 [2024-05-14 23:57:22.458681] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:22.040 pt2 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:22.040 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:22.298 malloc3 00:15:22.298 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:22.556 [2024-05-14 23:57:22.947495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:22.556 [2024-05-14 23:57:22.947542] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.556 [2024-05-14 23:57:22.947562] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d5080 00:15:22.556 [2024-05-14 23:57:22.947574] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.556 [2024-05-14 23:57:22.949113] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.556 [2024-05-14 23:57:22.949140] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:22.556 pt3 00:15:22.556 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:15:22.556 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:15:22.556 23:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:22.814 [2024-05-14 23:57:23.176127] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:22.814 [2024-05-14 23:57:23.177463] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:22.814 [2024-05-14 23:57:23.177521] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:22.814 [2024-05-14 23:57:23.177683] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d8910 00:15:22.814 [2024-05-14 23:57:23.177694] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:22.814 [2024-05-14 23:57:23.177891] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1340670 00:15:22.814 [2024-05-14 23:57:23.178038] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d8910 00:15:22.814 [2024-05-14 23:57:23.178048] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d8910 00:15:22.814 [2024-05-14 23:57:23.178152] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:22.814 "name": "raid_bdev1", 00:15:22.814 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:22.814 "strip_size_kb": 0, 00:15:22.814 "state": "online", 00:15:22.814 "raid_level": "raid1", 00:15:22.814 "superblock": true, 00:15:22.814 "num_base_bdevs": 3, 00:15:22.814 
"num_base_bdevs_discovered": 3, 00:15:22.814 "num_base_bdevs_operational": 3, 00:15:22.814 "base_bdevs_list": [ 00:15:22.814 { 00:15:22.814 "name": "pt1", 00:15:22.814 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:22.814 "is_configured": true, 00:15:22.814 "data_offset": 2048, 00:15:22.814 "data_size": 63488 00:15:22.814 }, 00:15:22.814 { 00:15:22.814 "name": "pt2", 00:15:22.814 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:22.814 "is_configured": true, 00:15:22.814 "data_offset": 2048, 00:15:22.814 "data_size": 63488 00:15:22.814 }, 00:15:22.814 { 00:15:22.814 "name": "pt3", 00:15:22.814 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:22.814 "is_configured": true, 00:15:22.814 "data_offset": 2048, 00:15:22.814 "data_size": 63488 00:15:22.814 } 00:15:22.814 ] 00:15:22.814 }' 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:22.814 23:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:23.748 23:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:23.748 [2024-05-14 23:57:24.203056] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:23.748 "name": "raid_bdev1", 00:15:23.748 "aliases": [ 00:15:23.748 "496a448b-16e1-45ae-9ed5-a286a6279200" 00:15:23.748 ], 00:15:23.748 "product_name": "Raid Volume", 00:15:23.748 "block_size": 512, 00:15:23.748 "num_blocks": 63488, 00:15:23.748 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:23.748 "assigned_rate_limits": { 00:15:23.748 "rw_ios_per_sec": 0, 00:15:23.748 "rw_mbytes_per_sec": 0, 00:15:23.748 "r_mbytes_per_sec": 0, 00:15:23.748 "w_mbytes_per_sec": 0 00:15:23.748 }, 00:15:23.748 "claimed": false, 00:15:23.748 "zoned": false, 00:15:23.748 "supported_io_types": { 00:15:23.748 "read": true, 00:15:23.748 "write": true, 00:15:23.748 "unmap": false, 00:15:23.748 "write_zeroes": true, 00:15:23.748 "flush": false, 00:15:23.748 "reset": true, 00:15:23.748 "compare": false, 00:15:23.748 "compare_and_write": false, 00:15:23.748 "abort": false, 00:15:23.748 "nvme_admin": false, 00:15:23.748 "nvme_io": false 00:15:23.748 }, 00:15:23.748 "memory_domains": [ 00:15:23.748 { 00:15:23.748 "dma_device_id": "system", 00:15:23.748 "dma_device_type": 1 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.748 "dma_device_type": 2 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "dma_device_id": "system", 00:15:23.748 "dma_device_type": 1 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:23.748 "dma_device_type": 2 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "dma_device_id": "system", 00:15:23.748 "dma_device_type": 1 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.748 "dma_device_type": 2 00:15:23.748 } 00:15:23.748 ], 00:15:23.748 "driver_specific": { 00:15:23.748 "raid": { 00:15:23.748 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:23.748 "strip_size_kb": 0, 00:15:23.748 "state": "online", 00:15:23.748 "raid_level": "raid1", 00:15:23.748 "superblock": true, 00:15:23.748 "num_base_bdevs": 3, 00:15:23.748 "num_base_bdevs_discovered": 3, 00:15:23.748 "num_base_bdevs_operational": 3, 00:15:23.748 "base_bdevs_list": [ 00:15:23.748 { 00:15:23.748 "name": "pt1", 00:15:23.748 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:23.748 "is_configured": true, 00:15:23.748 "data_offset": 2048, 00:15:23.748 "data_size": 63488 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "name": "pt2", 00:15:23.748 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:23.748 "is_configured": true, 00:15:23.748 "data_offset": 2048, 00:15:23.748 "data_size": 63488 00:15:23.748 }, 00:15:23.748 { 00:15:23.748 "name": "pt3", 00:15:23.748 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:23.748 "is_configured": true, 00:15:23.748 "data_offset": 2048, 00:15:23.748 "data_size": 63488 00:15:23.748 } 00:15:23.748 ] 00:15:23.748 } 00:15:23.748 } 00:15:23.748 }' 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:15:23.748 pt2 00:15:23.748 pt3' 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:23.748 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:24.006 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:24.006 "name": "pt1", 00:15:24.006 "aliases": [ 00:15:24.006 "2f853ff4-3c33-5abc-8c7a-edbacfacc18d" 00:15:24.006 ], 00:15:24.006 "product_name": "passthru", 00:15:24.006 "block_size": 512, 00:15:24.006 "num_blocks": 65536, 00:15:24.006 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:24.006 "assigned_rate_limits": { 00:15:24.006 "rw_ios_per_sec": 0, 00:15:24.006 "rw_mbytes_per_sec": 0, 00:15:24.006 "r_mbytes_per_sec": 0, 00:15:24.006 "w_mbytes_per_sec": 0 00:15:24.006 }, 00:15:24.006 "claimed": true, 00:15:24.006 "claim_type": "exclusive_write", 00:15:24.006 "zoned": false, 00:15:24.006 "supported_io_types": { 00:15:24.006 "read": true, 00:15:24.006 "write": true, 00:15:24.006 "unmap": true, 00:15:24.006 "write_zeroes": true, 00:15:24.006 "flush": true, 00:15:24.006 "reset": true, 00:15:24.006 "compare": false, 00:15:24.006 "compare_and_write": false, 00:15:24.006 "abort": true, 00:15:24.006 "nvme_admin": false, 00:15:24.006 "nvme_io": false 00:15:24.006 }, 00:15:24.006 "memory_domains": [ 00:15:24.006 { 00:15:24.006 "dma_device_id": "system", 00:15:24.006 "dma_device_type": 1 00:15:24.006 }, 00:15:24.006 { 00:15:24.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.006 "dma_device_type": 2 00:15:24.006 } 00:15:24.006 ], 00:15:24.006 "driver_specific": { 00:15:24.006 "passthru": { 
00:15:24.006 "name": "pt1", 00:15:24.006 "base_bdev_name": "malloc1" 00:15:24.006 } 00:15:24.006 } 00:15:24.006 }' 00:15:24.006 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:24.006 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:24.265 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:24.523 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:24.523 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:24.523 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:24.523 23:57:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:24.523 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:24.523 "name": "pt2", 00:15:24.523 "aliases": [ 00:15:24.523 "bafcc775-85cf-5acf-8def-6fecd4d2f101" 00:15:24.523 ], 00:15:24.523 "product_name": "passthru", 00:15:24.523 "block_size": 512, 00:15:24.523 "num_blocks": 65536, 00:15:24.523 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:24.523 "assigned_rate_limits": { 00:15:24.523 "rw_ios_per_sec": 0, 00:15:24.523 "rw_mbytes_per_sec": 0, 00:15:24.523 "r_mbytes_per_sec": 0, 00:15:24.523 "w_mbytes_per_sec": 0 00:15:24.523 }, 00:15:24.523 "claimed": true, 00:15:24.523 "claim_type": "exclusive_write", 00:15:24.523 "zoned": false, 00:15:24.523 "supported_io_types": { 00:15:24.523 "read": true, 00:15:24.523 "write": true, 00:15:24.523 "unmap": true, 00:15:24.523 "write_zeroes": true, 00:15:24.523 "flush": true, 00:15:24.523 "reset": true, 00:15:24.523 "compare": false, 00:15:24.523 "compare_and_write": false, 00:15:24.523 "abort": true, 00:15:24.523 "nvme_admin": false, 00:15:24.523 "nvme_io": false 00:15:24.523 }, 00:15:24.523 "memory_domains": [ 00:15:24.523 { 00:15:24.523 "dma_device_id": "system", 00:15:24.523 "dma_device_type": 1 00:15:24.523 }, 00:15:24.523 { 00:15:24.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.523 "dma_device_type": 2 00:15:24.523 } 00:15:24.523 ], 00:15:24.523 "driver_specific": { 00:15:24.523 "passthru": { 00:15:24.523 "name": "pt2", 00:15:24.523 "base_bdev_name": "malloc2" 00:15:24.523 } 00:15:24.523 } 00:15:24.523 }' 00:15:24.523 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:24.782 23:57:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:24.782 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:25.040 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:25.298 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:25.298 "name": "pt3", 00:15:25.298 "aliases": [ 00:15:25.298 "b553e3e7-b168-5191-8206-79840a7403a2" 00:15:25.298 ], 00:15:25.298 "product_name": "passthru", 00:15:25.298 "block_size": 512, 00:15:25.298 "num_blocks": 65536, 00:15:25.298 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:25.298 "assigned_rate_limits": { 00:15:25.298 "rw_ios_per_sec": 0, 00:15:25.298 "rw_mbytes_per_sec": 0, 00:15:25.298 "r_mbytes_per_sec": 0, 00:15:25.298 "w_mbytes_per_sec": 0 00:15:25.298 }, 00:15:25.298 "claimed": true, 00:15:25.298 "claim_type": "exclusive_write", 00:15:25.298 "zoned": false, 00:15:25.298 "supported_io_types": { 00:15:25.298 "read": true, 00:15:25.298 "write": true, 00:15:25.298 "unmap": true, 00:15:25.298 "write_zeroes": true, 00:15:25.298 "flush": true, 00:15:25.298 "reset": true, 00:15:25.298 "compare": false, 00:15:25.298 "compare_and_write": false, 00:15:25.298 "abort": true, 00:15:25.298 "nvme_admin": false, 00:15:25.298 "nvme_io": false 00:15:25.298 }, 00:15:25.298 "memory_domains": [ 00:15:25.298 { 00:15:25.298 "dma_device_id": "system", 00:15:25.298 "dma_device_type": 1 00:15:25.298 }, 00:15:25.298 { 00:15:25.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.298 "dma_device_type": 2 00:15:25.298 } 00:15:25.298 ], 00:15:25.299 "driver_specific": { 00:15:25.299 "passthru": { 00:15:25.299 "name": "pt3", 00:15:25.299 "base_bdev_name": "malloc3" 00:15:25.299 } 00:15:25.299 } 00:15:25.299 }' 00:15:25.299 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:25.299 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:25.299 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:25.299 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:25.299 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:25.557 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.557 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:25.557 23:57:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:25.557 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.557 23:57:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:25.557 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:25.557 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:25.557 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:25.557 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:15:25.815 [2024-05-14 23:57:26.296584] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:25.815 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=496a448b-16e1-45ae-9ed5-a286a6279200 00:15:25.815 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 496a448b-16e1-45ae-9ed5-a286a6279200 ']' 00:15:25.815 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:26.073 [2024-05-14 23:57:26.540978] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:26.073 [2024-05-14 23:57:26.541003] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:26.073 [2024-05-14 23:57:26.541055] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:26.073 [2024-05-14 23:57:26.541126] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:26.073 [2024-05-14 23:57:26.541138] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d8910 name raid_bdev1, state offline 00:15:26.073 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.073 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:15:26.331 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:15:26.331 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:15:26.331 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:26.331 23:57:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:26.588 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:26.588 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:26.846 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:26.846 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:27.104 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == 
"passthru")] | any' 00:15:27.104 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:27.362 23:57:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:27.619 [2024-05-14 23:57:28.000774] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:27.619 [2024-05-14 23:57:28.002178] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:27.619 [2024-05-14 23:57:28.002222] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:27.619 [2024-05-14 23:57:28.002271] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:27.619 [2024-05-14 23:57:28.002310] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:27.619 [2024-05-14 23:57:28.002333] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:27.619 [2024-05-14 23:57:28.002351] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:27.619 [2024-05-14 23:57:28.002361] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d5480 name raid_bdev1, state configuring 00:15:27.619 request: 00:15:27.619 { 00:15:27.619 "name": 
"raid_bdev1", 00:15:27.619 "raid_level": "raid1", 00:15:27.619 "base_bdevs": [ 00:15:27.619 "malloc1", 00:15:27.619 "malloc2", 00:15:27.619 "malloc3" 00:15:27.619 ], 00:15:27.619 "superblock": false, 00:15:27.619 "method": "bdev_raid_create", 00:15:27.619 "req_id": 1 00:15:27.619 } 00:15:27.619 Got JSON-RPC error response 00:15:27.619 response: 00:15:27.619 { 00:15:27.619 "code": -17, 00:15:27.619 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:27.619 } 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.619 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:15:27.878 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:15:27.878 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:15:27.878 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:28.136 [2024-05-14 23:57:28.490005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:28.136 [2024-05-14 23:57:28.490050] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.136 [2024-05-14 23:57:28.490071] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d2aa0 00:15:28.136 [2024-05-14 23:57:28.490084] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.136 [2024-05-14 23:57:28.491731] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.136 [2024-05-14 23:57:28.491759] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:28.136 [2024-05-14 23:57:28.491830] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:15:28.136 [2024-05-14 23:57:28.491857] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:28.136 pt1 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:28.136 23:57:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.136 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:28.394 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:28.394 "name": "raid_bdev1", 00:15:28.394 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:28.394 "strip_size_kb": 0, 00:15:28.394 "state": "configuring", 00:15:28.394 "raid_level": "raid1", 00:15:28.394 "superblock": true, 00:15:28.394 "num_base_bdevs": 3, 00:15:28.394 "num_base_bdevs_discovered": 1, 00:15:28.394 "num_base_bdevs_operational": 3, 00:15:28.394 "base_bdevs_list": [ 00:15:28.394 { 00:15:28.394 "name": "pt1", 00:15:28.394 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:28.394 "is_configured": true, 00:15:28.394 "data_offset": 2048, 00:15:28.394 "data_size": 63488 00:15:28.394 }, 00:15:28.394 { 00:15:28.394 "name": null, 00:15:28.394 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:28.394 "is_configured": false, 00:15:28.394 "data_offset": 2048, 00:15:28.394 "data_size": 63488 00:15:28.394 }, 00:15:28.394 { 00:15:28.394 "name": null, 00:15:28.394 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:28.394 "is_configured": false, 00:15:28.394 "data_offset": 2048, 00:15:28.394 "data_size": 63488 00:15:28.394 } 00:15:28.394 ] 00:15:28.394 }' 00:15:28.394 23:57:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:28.394 23:57:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.960 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:15:28.960 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:28.960 [2024-05-14 23:57:29.532776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:28.960 [2024-05-14 23:57:29.532826] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:28.960 [2024-05-14 23:57:29.532848] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d40c0 00:15:28.960 [2024-05-14 23:57:29.532861] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:28.960 [2024-05-14 23:57:29.533191] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:28.960 [2024-05-14 23:57:29.533208] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:28.960 [2024-05-14 23:57:29.533269] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:28.960 [2024-05-14 23:57:29.533288] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:28.960 pt2 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:29.218 [2024-05-14 23:57:29.777458] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:29.218 23:57:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.218 23:57:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:29.476 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:29.476 "name": "raid_bdev1", 00:15:29.476 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:29.476 "strip_size_kb": 0, 00:15:29.476 "state": "configuring", 00:15:29.476 "raid_level": "raid1", 00:15:29.476 "superblock": true, 00:15:29.476 "num_base_bdevs": 3, 00:15:29.476 "num_base_bdevs_discovered": 1, 00:15:29.476 "num_base_bdevs_operational": 3, 00:15:29.476 "base_bdevs_list": [ 00:15:29.476 { 00:15:29.476 "name": "pt1", 00:15:29.476 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:29.476 "is_configured": true, 00:15:29.476 "data_offset": 2048, 00:15:29.476 "data_size": 63488 00:15:29.476 }, 00:15:29.476 { 00:15:29.476 "name": null, 00:15:29.476 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:29.476 "is_configured": false, 00:15:29.476 "data_offset": 2048, 00:15:29.476 "data_size": 63488 00:15:29.476 }, 00:15:29.476 { 00:15:29.476 "name": null, 00:15:29.476 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:29.476 "is_configured": false, 00:15:29.476 "data_offset": 2048, 00:15:29.476 "data_size": 63488 00:15:29.476 } 00:15:29.476 ] 00:15:29.476 }' 00:15:29.476 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:29.476 23:57:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:30.408 [2024-05-14 23:57:30.868331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:30.408 [2024-05-14 23:57:30.868384] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.408 [2024-05-14 23:57:30.868414] vbdev_passthru.c: 
676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132a370 00:15:30.408 [2024-05-14 23:57:30.868428] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.408 [2024-05-14 23:57:30.868763] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.408 [2024-05-14 23:57:30.868779] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:30.408 [2024-05-14 23:57:30.868843] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:30.408 [2024-05-14 23:57:30.868863] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:30.408 pt2 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:30.408 23:57:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:30.666 [2024-05-14 23:57:31.116985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:30.666 [2024-05-14 23:57:31.117021] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.666 [2024-05-14 23:57:31.117042] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d4660 00:15:30.666 [2024-05-14 23:57:31.117054] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.666 [2024-05-14 23:57:31.117345] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.666 [2024-05-14 23:57:31.117362] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:30.666 [2024-05-14 23:57:31.117427] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:15:30.666 [2024-05-14 23:57:31.117451] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:30.666 [2024-05-14 23:57:31.117577] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d6890 00:15:30.666 [2024-05-14 23:57:31.117592] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:30.666 [2024-05-14 23:57:31.117781] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1340670 00:15:30.666 [2024-05-14 23:57:31.117913] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d6890 00:15:30.666 [2024-05-14 23:57:31.117922] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d6890 00:15:30.666 [2024-05-14 23:57:31.118021] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.666 pt3 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.666 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:30.924 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:30.924 "name": "raid_bdev1", 00:15:30.924 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:30.924 "strip_size_kb": 0, 00:15:30.924 "state": "online", 00:15:30.924 "raid_level": "raid1", 00:15:30.924 "superblock": true, 00:15:30.924 "num_base_bdevs": 3, 00:15:30.924 "num_base_bdevs_discovered": 3, 00:15:30.924 "num_base_bdevs_operational": 3, 00:15:30.924 "base_bdevs_list": [ 00:15:30.924 { 00:15:30.924 "name": "pt1", 00:15:30.924 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:30.924 "is_configured": true, 00:15:30.924 "data_offset": 2048, 00:15:30.924 "data_size": 63488 00:15:30.924 }, 00:15:30.924 { 00:15:30.924 "name": "pt2", 00:15:30.924 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:30.924 "is_configured": true, 00:15:30.924 "data_offset": 2048, 00:15:30.924 "data_size": 63488 00:15:30.924 }, 00:15:30.924 { 00:15:30.924 "name": "pt3", 00:15:30.924 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:30.924 "is_configured": true, 00:15:30.924 "data_offset": 2048, 00:15:30.924 "data_size": 63488 00:15:30.924 } 00:15:30.924 ] 00:15:30.924 }' 00:15:30.924 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:30.924 23:57:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:31.491 23:57:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:31.751 [2024-05-14 23:57:32.200110] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:31.751 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:31.751 "name": "raid_bdev1", 
00:15:31.751 "aliases": [ 00:15:31.751 "496a448b-16e1-45ae-9ed5-a286a6279200" 00:15:31.751 ], 00:15:31.751 "product_name": "Raid Volume", 00:15:31.751 "block_size": 512, 00:15:31.751 "num_blocks": 63488, 00:15:31.751 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:31.751 "assigned_rate_limits": { 00:15:31.751 "rw_ios_per_sec": 0, 00:15:31.751 "rw_mbytes_per_sec": 0, 00:15:31.751 "r_mbytes_per_sec": 0, 00:15:31.751 "w_mbytes_per_sec": 0 00:15:31.751 }, 00:15:31.751 "claimed": false, 00:15:31.751 "zoned": false, 00:15:31.751 "supported_io_types": { 00:15:31.751 "read": true, 00:15:31.751 "write": true, 00:15:31.751 "unmap": false, 00:15:31.751 "write_zeroes": true, 00:15:31.751 "flush": false, 00:15:31.751 "reset": true, 00:15:31.751 "compare": false, 00:15:31.751 "compare_and_write": false, 00:15:31.751 "abort": false, 00:15:31.751 "nvme_admin": false, 00:15:31.751 "nvme_io": false 00:15:31.751 }, 00:15:31.751 "memory_domains": [ 00:15:31.751 { 00:15:31.751 "dma_device_id": "system", 00:15:31.751 "dma_device_type": 1 00:15:31.751 }, 00:15:31.751 { 00:15:31.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.751 "dma_device_type": 2 00:15:31.751 }, 00:15:31.751 { 00:15:31.751 "dma_device_id": "system", 00:15:31.751 "dma_device_type": 1 00:15:31.751 }, 00:15:31.751 { 00:15:31.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.751 "dma_device_type": 2 00:15:31.751 }, 00:15:31.751 { 00:15:31.751 "dma_device_id": "system", 00:15:31.751 "dma_device_type": 1 00:15:31.751 }, 00:15:31.751 { 00:15:31.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.751 "dma_device_type": 2 00:15:31.751 } 00:15:31.751 ], 00:15:31.751 "driver_specific": { 00:15:31.751 "raid": { 00:15:31.751 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:31.751 "strip_size_kb": 0, 00:15:31.751 "state": "online", 00:15:31.751 "raid_level": "raid1", 00:15:31.751 "superblock": true, 00:15:31.751 "num_base_bdevs": 3, 00:15:31.751 "num_base_bdevs_discovered": 3, 00:15:31.751 "num_base_bdevs_operational": 3, 00:15:31.751 "base_bdevs_list": [ 00:15:31.751 { 00:15:31.751 "name": "pt1", 00:15:31.751 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:31.751 "is_configured": true, 00:15:31.751 "data_offset": 2048, 00:15:31.752 "data_size": 63488 00:15:31.752 }, 00:15:31.752 { 00:15:31.752 "name": "pt2", 00:15:31.752 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:31.752 "is_configured": true, 00:15:31.752 "data_offset": 2048, 00:15:31.752 "data_size": 63488 00:15:31.752 }, 00:15:31.752 { 00:15:31.752 "name": "pt3", 00:15:31.752 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:31.752 "is_configured": true, 00:15:31.752 "data_offset": 2048, 00:15:31.752 "data_size": 63488 00:15:31.752 } 00:15:31.752 ] 00:15:31.752 } 00:15:31.752 } 00:15:31.752 }' 00:15:31.752 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:31.752 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:15:31.752 pt2 00:15:31.752 pt3' 00:15:31.752 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:31.752 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:31.752 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:32.011 23:57:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:32.011 "name": "pt1", 00:15:32.011 "aliases": [ 00:15:32.011 "2f853ff4-3c33-5abc-8c7a-edbacfacc18d" 00:15:32.011 ], 00:15:32.011 "product_name": "passthru", 00:15:32.011 "block_size": 512, 00:15:32.011 "num_blocks": 65536, 00:15:32.011 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:32.011 "assigned_rate_limits": { 00:15:32.011 "rw_ios_per_sec": 0, 00:15:32.011 "rw_mbytes_per_sec": 0, 00:15:32.011 "r_mbytes_per_sec": 0, 00:15:32.011 "w_mbytes_per_sec": 0 00:15:32.011 }, 00:15:32.012 "claimed": true, 00:15:32.012 "claim_type": "exclusive_write", 00:15:32.012 "zoned": false, 00:15:32.012 "supported_io_types": { 00:15:32.012 "read": true, 00:15:32.012 "write": true, 00:15:32.012 "unmap": true, 00:15:32.012 "write_zeroes": true, 00:15:32.012 "flush": true, 00:15:32.012 "reset": true, 00:15:32.012 "compare": false, 00:15:32.012 "compare_and_write": false, 00:15:32.012 "abort": true, 00:15:32.012 "nvme_admin": false, 00:15:32.012 "nvme_io": false 00:15:32.012 }, 00:15:32.012 "memory_domains": [ 00:15:32.012 { 00:15:32.012 "dma_device_id": "system", 00:15:32.012 "dma_device_type": 1 00:15:32.012 }, 00:15:32.012 { 00:15:32.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.012 "dma_device_type": 2 00:15:32.012 } 00:15:32.012 ], 00:15:32.012 "driver_specific": { 00:15:32.012 "passthru": { 00:15:32.012 "name": "pt1", 00:15:32.012 "base_bdev_name": "malloc1" 00:15:32.012 } 00:15:32.012 } 00:15:32.012 }' 00:15:32.012 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:32.012 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:32.012 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:32.012 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:32.270 23:57:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:32.528 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:32.528 "name": "pt2", 00:15:32.528 "aliases": [ 00:15:32.528 "bafcc775-85cf-5acf-8def-6fecd4d2f101" 00:15:32.528 ], 00:15:32.528 "product_name": "passthru", 00:15:32.528 "block_size": 512, 00:15:32.528 "num_blocks": 65536, 00:15:32.528 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:32.528 "assigned_rate_limits": { 00:15:32.528 "rw_ios_per_sec": 0, 00:15:32.528 
"rw_mbytes_per_sec": 0, 00:15:32.528 "r_mbytes_per_sec": 0, 00:15:32.528 "w_mbytes_per_sec": 0 00:15:32.528 }, 00:15:32.528 "claimed": true, 00:15:32.528 "claim_type": "exclusive_write", 00:15:32.528 "zoned": false, 00:15:32.528 "supported_io_types": { 00:15:32.528 "read": true, 00:15:32.528 "write": true, 00:15:32.528 "unmap": true, 00:15:32.528 "write_zeroes": true, 00:15:32.528 "flush": true, 00:15:32.528 "reset": true, 00:15:32.528 "compare": false, 00:15:32.528 "compare_and_write": false, 00:15:32.528 "abort": true, 00:15:32.528 "nvme_admin": false, 00:15:32.528 "nvme_io": false 00:15:32.528 }, 00:15:32.528 "memory_domains": [ 00:15:32.528 { 00:15:32.528 "dma_device_id": "system", 00:15:32.528 "dma_device_type": 1 00:15:32.528 }, 00:15:32.528 { 00:15:32.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.528 "dma_device_type": 2 00:15:32.528 } 00:15:32.528 ], 00:15:32.528 "driver_specific": { 00:15:32.528 "passthru": { 00:15:32.528 "name": "pt2", 00:15:32.528 "base_bdev_name": "malloc2" 00:15:32.528 } 00:15:32.528 } 00:15:32.528 }' 00:15:32.528 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.786 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:33.044 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:33.044 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:33.044 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:33.044 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:33.044 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:33.302 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:33.303 "name": "pt3", 00:15:33.303 "aliases": [ 00:15:33.303 "b553e3e7-b168-5191-8206-79840a7403a2" 00:15:33.303 ], 00:15:33.303 "product_name": "passthru", 00:15:33.303 "block_size": 512, 00:15:33.303 "num_blocks": 65536, 00:15:33.303 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:33.303 "assigned_rate_limits": { 00:15:33.303 "rw_ios_per_sec": 0, 00:15:33.303 "rw_mbytes_per_sec": 0, 00:15:33.303 "r_mbytes_per_sec": 0, 00:15:33.303 "w_mbytes_per_sec": 0 00:15:33.303 }, 00:15:33.303 "claimed": true, 00:15:33.303 "claim_type": "exclusive_write", 00:15:33.303 "zoned": false, 00:15:33.303 "supported_io_types": { 00:15:33.303 "read": true, 00:15:33.303 "write": true, 00:15:33.303 "unmap": true, 00:15:33.303 "write_zeroes": true, 00:15:33.303 "flush": true, 00:15:33.303 "reset": true, 
00:15:33.303 "compare": false, 00:15:33.303 "compare_and_write": false, 00:15:33.303 "abort": true, 00:15:33.303 "nvme_admin": false, 00:15:33.303 "nvme_io": false 00:15:33.303 }, 00:15:33.303 "memory_domains": [ 00:15:33.303 { 00:15:33.303 "dma_device_id": "system", 00:15:33.303 "dma_device_type": 1 00:15:33.303 }, 00:15:33.303 { 00:15:33.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.303 "dma_device_type": 2 00:15:33.303 } 00:15:33.303 ], 00:15:33.303 "driver_specific": { 00:15:33.303 "passthru": { 00:15:33.303 "name": "pt3", 00:15:33.303 "base_bdev_name": "malloc3" 00:15:33.303 } 00:15:33.303 } 00:15:33.303 }' 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.303 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:33.562 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:33.562 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.562 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:33.562 23:57:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:33.562 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:33.562 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.562 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:15:33.824 [2024-05-14 23:57:34.265577] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.824 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 496a448b-16e1-45ae-9ed5-a286a6279200 '!=' 496a448b-16e1-45ae-9ed5-a286a6279200 ']' 00:15:33.824 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:15:33.824 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:33.824 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:15:33.824 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:34.144 [2024-05-14 23:57:34.509998] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.144 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:34.403 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:34.403 "name": "raid_bdev1", 00:15:34.403 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:34.403 "strip_size_kb": 0, 00:15:34.403 "state": "online", 00:15:34.403 "raid_level": "raid1", 00:15:34.403 "superblock": true, 00:15:34.403 "num_base_bdevs": 3, 00:15:34.403 "num_base_bdevs_discovered": 2, 00:15:34.403 "num_base_bdevs_operational": 2, 00:15:34.403 "base_bdevs_list": [ 00:15:34.403 { 00:15:34.403 "name": null, 00:15:34.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.403 "is_configured": false, 00:15:34.403 "data_offset": 2048, 00:15:34.403 "data_size": 63488 00:15:34.403 }, 00:15:34.403 { 00:15:34.403 "name": "pt2", 00:15:34.403 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:34.403 "is_configured": true, 00:15:34.403 "data_offset": 2048, 00:15:34.403 "data_size": 63488 00:15:34.403 }, 00:15:34.403 { 00:15:34.403 "name": "pt3", 00:15:34.403 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:34.403 "is_configured": true, 00:15:34.403 "data_offset": 2048, 00:15:34.403 "data_size": 63488 00:15:34.403 } 00:15:34.403 ] 00:15:34.403 }' 00:15:34.403 23:57:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:34.403 23:57:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.994 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:34.994 [2024-05-14 23:57:35.500587] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:34.994 [2024-05-14 23:57:35.500615] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:34.994 [2024-05-14 23:57:35.500673] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:34.994 [2024-05-14 23:57:35.500727] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:34.994 [2024-05-14 23:57:35.500740] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d6890 name raid_bdev1, state offline 00:15:34.994 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.994 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:15:35.254 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:15:35.254 23:57:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:15:35.254 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:15:35.254 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:35.254 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:35.513 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:15:35.513 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:35.513 23:57:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:35.513 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:15:35.513 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:35.513 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:15:35.513 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:15:35.513 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:35.771 [2024-05-14 23:57:36.274605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:35.771 [2024-05-14 23:57:36.274655] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.771 [2024-05-14 23:57:36.274675] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d3e50 00:15:35.771 [2024-05-14 23:57:36.274687] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.771 [2024-05-14 23:57:36.276287] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.771 [2024-05-14 23:57:36.276315] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:35.771 [2024-05-14 23:57:36.276382] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:35.771 [2024-05-14 23:57:36.276419] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:35.771 pt2 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:35.771 
23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.771 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.029 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:36.029 "name": "raid_bdev1", 00:15:36.029 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:36.029 "strip_size_kb": 0, 00:15:36.029 "state": "configuring", 00:15:36.029 "raid_level": "raid1", 00:15:36.029 "superblock": true, 00:15:36.029 "num_base_bdevs": 3, 00:15:36.029 "num_base_bdevs_discovered": 1, 00:15:36.029 "num_base_bdevs_operational": 2, 00:15:36.029 "base_bdevs_list": [ 00:15:36.029 { 00:15:36.029 "name": null, 00:15:36.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.029 "is_configured": false, 00:15:36.029 "data_offset": 2048, 00:15:36.029 "data_size": 63488 00:15:36.029 }, 00:15:36.029 { 00:15:36.030 "name": "pt2", 00:15:36.030 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:36.030 "is_configured": true, 00:15:36.030 "data_offset": 2048, 00:15:36.030 "data_size": 63488 00:15:36.030 }, 00:15:36.030 { 00:15:36.030 "name": null, 00:15:36.030 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:36.030 "is_configured": false, 00:15:36.030 "data_offset": 2048, 00:15:36.030 "data_size": 63488 00:15:36.030 } 00:15:36.030 ] 00:15:36.030 }' 00:15:36.030 23:57:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:36.030 23:57:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.594 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:15:36.594 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:15:36.594 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=2 00:15:36.595 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:36.852 [2024-05-14 23:57:37.293301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:36.853 [2024-05-14 23:57:37.293356] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.853 [2024-05-14 23:57:37.293380] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1329b50 00:15:36.853 [2024-05-14 23:57:37.293394] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.853 [2024-05-14 23:57:37.293757] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.853 [2024-05-14 23:57:37.293775] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:36.853 [2024-05-14 23:57:37.293842] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:15:36.853 [2024-05-14 23:57:37.293863] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:36.853 [2024-05-14 23:57:37.293961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x14da670 00:15:36.853 [2024-05-14 23:57:37.293972] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 
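The trace above re-creates pt2 as a passthru bdev over malloc2, lets examine re-read the raid superblock from it, and confirms via verify_raid_bdev_state that raid_bdev1 is back in the configuring state with one of its three base bdevs discovered; pt3 is then re-created the same way. A minimal standalone sketch of that RPC sequence, using only the rpc.py calls and jq filter visible in the trace (the rpc shorthand variable is illustrative, not the test script's own name):
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# re-create the passthru bdevs over the malloc bases; examine finds the raid superblock on each
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$rpc bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003
# inspect the re-assembled raid bdev; with two of the three bases back, the trace shows it reporting "state": "online"
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'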
00:15:36.853 [2024-05-14 23:57:37.294137] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c91a0 00:15:36.853 [2024-05-14 23:57:37.294261] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14da670 00:15:36.853 [2024-05-14 23:57:37.294271] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14da670 00:15:36.853 [2024-05-14 23:57:37.294366] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.853 pt3 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.853 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.111 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:37.111 "name": "raid_bdev1", 00:15:37.111 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:37.111 "strip_size_kb": 0, 00:15:37.111 "state": "online", 00:15:37.111 "raid_level": "raid1", 00:15:37.111 "superblock": true, 00:15:37.111 "num_base_bdevs": 3, 00:15:37.111 "num_base_bdevs_discovered": 2, 00:15:37.111 "num_base_bdevs_operational": 2, 00:15:37.111 "base_bdevs_list": [ 00:15:37.111 { 00:15:37.111 "name": null, 00:15:37.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.111 "is_configured": false, 00:15:37.111 "data_offset": 2048, 00:15:37.111 "data_size": 63488 00:15:37.111 }, 00:15:37.111 { 00:15:37.111 "name": "pt2", 00:15:37.111 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:37.111 "is_configured": true, 00:15:37.111 "data_offset": 2048, 00:15:37.111 "data_size": 63488 00:15:37.111 }, 00:15:37.111 { 00:15:37.111 "name": "pt3", 00:15:37.111 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:37.111 "is_configured": true, 00:15:37.112 "data_offset": 2048, 00:15:37.112 "data_size": 63488 00:15:37.112 } 00:15:37.112 ] 00:15:37.112 }' 00:15:37.112 23:57:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:37.112 23:57:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.678 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 3 -gt 2 ']' 00:15:37.678 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:37.937 [2024-05-14 23:57:38.368145] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:37.937 [2024-05-14 23:57:38.368170] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:37.937 [2024-05-14 23:57:38.368224] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:37.937 [2024-05-14 23:57:38.368289] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:37.937 [2024-05-14 23:57:38.368302] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14da670 name raid_bdev1, state offline 00:15:37.937 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.937 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # jq -r '.[]' 00:15:38.195 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # raid_bdev= 00:15:38.195 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@529 -- # '[' -n '' ']' 00:15:38.195 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:38.455 [2024-05-14 23:57:38.865449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:38.455 [2024-05-14 23:57:38.865504] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:38.455 [2024-05-14 23:57:38.865525] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c9a90 00:15:38.455 [2024-05-14 23:57:38.865538] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:38.455 [2024-05-14 23:57:38.867139] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:38.455 [2024-05-14 23:57:38.867176] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:38.455 [2024-05-14 23:57:38.867244] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:15:38.455 [2024-05-14 23:57:38.867270] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:38.455 pt1 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:38.455 23:57:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.455 23:57:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.714 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:38.714 "name": "raid_bdev1", 00:15:38.714 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:38.714 "strip_size_kb": 0, 00:15:38.714 "state": "configuring", 00:15:38.714 "raid_level": "raid1", 00:15:38.714 "superblock": true, 00:15:38.714 "num_base_bdevs": 3, 00:15:38.714 "num_base_bdevs_discovered": 1, 00:15:38.714 "num_base_bdevs_operational": 3, 00:15:38.714 "base_bdevs_list": [ 00:15:38.714 { 00:15:38.714 "name": "pt1", 00:15:38.714 "uuid": "2f853ff4-3c33-5abc-8c7a-edbacfacc18d", 00:15:38.714 "is_configured": true, 00:15:38.714 "data_offset": 2048, 00:15:38.714 "data_size": 63488 00:15:38.714 }, 00:15:38.714 { 00:15:38.714 "name": null, 00:15:38.714 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:38.714 "is_configured": false, 00:15:38.714 "data_offset": 2048, 00:15:38.714 "data_size": 63488 00:15:38.714 }, 00:15:38.714 { 00:15:38.714 "name": null, 00:15:38.714 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:38.714 "is_configured": false, 00:15:38.714 "data_offset": 2048, 00:15:38.714 "data_size": 63488 00:15:38.714 } 00:15:38.714 ] 00:15:38.714 }' 00:15:38.714 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:38.714 23:57:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.282 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i = 1 )) 00:15:39.282 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:39.282 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:39.541 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:15:39.541 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:39.541 23:57:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:39.799 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:15:39.799 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:39.799 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # i=2 00:15:39.799 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:40.057 [2024-05-14 23:57:40.465702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:40.057 [2024-05-14 23:57:40.465748] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.057 [2024-05-14 23:57:40.465770] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d3e50 00:15:40.057 [2024-05-14 23:57:40.465782] vbdev_passthru.c: 
691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.057 [2024-05-14 23:57:40.466118] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.057 [2024-05-14 23:57:40.466135] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:40.057 [2024-05-14 23:57:40.466196] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:15:40.058 [2024-05-14 23:57:40.466208] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt3 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:40.058 [2024-05-14 23:57:40.466217] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:40.058 [2024-05-14 23:57:40.466233] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d9b00 name raid_bdev1, state configuring 00:15:40.058 [2024-05-14 23:57:40.466265] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:40.058 pt3 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@551 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:40.058 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.317 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:40.317 "name": "raid_bdev1", 00:15:40.317 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:40.317 "strip_size_kb": 0, 00:15:40.317 "state": "configuring", 00:15:40.317 "raid_level": "raid1", 00:15:40.317 "superblock": true, 00:15:40.317 "num_base_bdevs": 3, 00:15:40.317 "num_base_bdevs_discovered": 1, 00:15:40.317 "num_base_bdevs_operational": 2, 00:15:40.317 "base_bdevs_list": [ 00:15:40.317 { 00:15:40.317 "name": null, 00:15:40.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:40.317 "is_configured": false, 00:15:40.317 "data_offset": 2048, 00:15:40.317 "data_size": 63488 00:15:40.317 }, 00:15:40.317 { 00:15:40.317 "name": null, 00:15:40.317 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:40.317 "is_configured": false, 00:15:40.317 "data_offset": 2048, 00:15:40.317 "data_size": 63488 00:15:40.317 }, 00:15:40.317 { 00:15:40.317 "name": "pt3", 00:15:40.317 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:40.317 "is_configured": true, 00:15:40.317 "data_offset": 
2048, 00:15:40.317 "data_size": 63488 00:15:40.317 } 00:15:40.317 ] 00:15:40.317 }' 00:15:40.317 23:57:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:40.317 23:57:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.885 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i = 1 )) 00:15:40.885 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:15:40.885 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:41.144 [2024-05-14 23:57:41.564614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:41.144 [2024-05-14 23:57:41.564664] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.144 [2024-05-14 23:57:41.564685] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d8f70 00:15:41.144 [2024-05-14 23:57:41.564699] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.144 [2024-05-14 23:57:41.565039] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.144 [2024-05-14 23:57:41.565056] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:41.144 [2024-05-14 23:57:41.565118] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:41.144 [2024-05-14 23:57:41.565138] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:41.144 [2024-05-14 23:57:41.565235] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x14db0e0 00:15:41.144 [2024-05-14 23:57:41.565246] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:41.144 [2024-05-14 23:57:41.565422] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d1f60 00:15:41.144 [2024-05-14 23:57:41.565551] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14db0e0 00:15:41.144 [2024-05-14 23:57:41.565561] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14db0e0 00:15:41.144 [2024-05-14 23:57:41.565661] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:41.144 pt2 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i++ )) 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@559 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:41.144 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:41.145 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.145 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.404 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:41.404 "name": "raid_bdev1", 00:15:41.404 "uuid": "496a448b-16e1-45ae-9ed5-a286a6279200", 00:15:41.404 "strip_size_kb": 0, 00:15:41.404 "state": "online", 00:15:41.404 "raid_level": "raid1", 00:15:41.404 "superblock": true, 00:15:41.404 "num_base_bdevs": 3, 00:15:41.404 "num_base_bdevs_discovered": 2, 00:15:41.404 "num_base_bdevs_operational": 2, 00:15:41.404 "base_bdevs_list": [ 00:15:41.404 { 00:15:41.404 "name": null, 00:15:41.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.404 "is_configured": false, 00:15:41.404 "data_offset": 2048, 00:15:41.404 "data_size": 63488 00:15:41.404 }, 00:15:41.404 { 00:15:41.404 "name": "pt2", 00:15:41.404 "uuid": "bafcc775-85cf-5acf-8def-6fecd4d2f101", 00:15:41.404 "is_configured": true, 00:15:41.404 "data_offset": 2048, 00:15:41.404 "data_size": 63488 00:15:41.404 }, 00:15:41.404 { 00:15:41.404 "name": "pt3", 00:15:41.404 "uuid": "b553e3e7-b168-5191-8206-79840a7403a2", 00:15:41.404 "is_configured": true, 00:15:41.404 "data_offset": 2048, 00:15:41.404 "data_size": 63488 00:15:41.404 } 00:15:41.404 ] 00:15:41.404 }' 00:15:41.404 23:57:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:41.404 23:57:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.972 23:57:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:15:41.972 23:57:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:42.233 [2024-05-14 23:57:42.659752] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' 496a448b-16e1-45ae-9ed5-a286a6279200 '!=' 496a448b-16e1-45ae-9ed5-a286a6279200 ']' 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 430828 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 430828 ']' 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 430828 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 430828 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 430828' 
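The comparison at bdev_raid.sh@563 above re-reads raid_bdev1's UUID after the delete and re-assembly cycle and checks it against the value recorded when the bdev was first built, after which killprocess tears down the app. A sketch of that readback, using only the rpc.py and jq invocations shown in the trace (the rpc shorthand is again illustrative):
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# fetch the current uuid of raid_bdev1; the test expects it to match the pre-teardown value
uuid=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
echo "raid_bdev1 uuid: $uuid"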
00:15:42.233 killing process with pid 430828 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 430828 00:15:42.233 [2024-05-14 23:57:42.732922] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:42.233 [2024-05-14 23:57:42.732992] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.233 [2024-05-14 23:57:42.733049] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.233 [2024-05-14 23:57:42.733062] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14db0e0 name raid_bdev1, state offline 00:15:42.233 23:57:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 430828 00:15:42.233 [2024-05-14 23:57:42.761989] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:42.492 23:57:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:15:42.492 00:15:42.492 real 0m22.513s 00:15:42.492 user 0m41.093s 00:15:42.492 sys 0m4.019s 00:15:42.492 23:57:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:42.492 23:57:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.492 ************************************ 00:15:42.492 END TEST raid_superblock_test 00:15:42.492 ************************************ 00:15:42.492 23:57:43 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:15:42.492 23:57:43 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:15:42.492 23:57:43 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:15:42.492 23:57:43 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:15:42.492 23:57:43 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:42.492 23:57:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:42.751 ************************************ 00:15:42.751 START TEST raid_state_function_test 00:15:42.751 ************************************ 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 false 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:42.751 23:57:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:42.751 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=434289 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 434289' 00:15:42.752 Process raid pid: 434289 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 434289 /var/tmp/spdk-raid.sock 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 434289 ']' 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:42.752 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:42.752 23:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.752 [2024-05-14 23:57:43.147845] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
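For raid_state_function_test the harness launches a fresh bdev_svc app with bdev_raid debug logging and waits for its RPC socket before issuing any commands. A rough sketch of that startup pattern, with the binary path and flags taken from the trace; the polling loop and the rpc_get_methods liveness probe are only a stand-in for the waitforlisten helper, not its actual implementation:
# start the minimal SPDK app that the state-function test drives over RPC
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
# poll until the RPC socket answers (stand-in for waitforlisten)
until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
done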
00:15:42.752 [2024-05-14 23:57:43.147890] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:42.752 [2024-05-14 23:57:43.259746] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:43.022 [2024-05-14 23:57:43.369751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:43.022 [2024-05-14 23:57:43.433706] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.022 [2024-05-14 23:57:43.433738] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.593 23:57:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:43.593 23:57:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:15:43.593 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:43.852 [2024-05-14 23:57:44.318839] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:43.852 [2024-05-14 23:57:44.318885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:43.852 [2024-05-14 23:57:44.318900] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:43.852 [2024-05-14 23:57:44.318917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:43.852 [2024-05-14 23:57:44.318929] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:43.852 [2024-05-14 23:57:44.318947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:43.852 [2024-05-14 23:57:44.318959] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:43.852 [2024-05-14 23:57:44.318975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:43.852 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:43.852 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:43.852 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.853 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.112 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:44.112 "name": "Existed_Raid", 00:15:44.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.112 "strip_size_kb": 64, 00:15:44.112 "state": "configuring", 00:15:44.112 "raid_level": "raid0", 00:15:44.112 "superblock": false, 00:15:44.112 "num_base_bdevs": 4, 00:15:44.112 "num_base_bdevs_discovered": 0, 00:15:44.112 "num_base_bdevs_operational": 4, 00:15:44.112 "base_bdevs_list": [ 00:15:44.112 { 00:15:44.112 "name": "BaseBdev1", 00:15:44.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.112 "is_configured": false, 00:15:44.112 "data_offset": 0, 00:15:44.112 "data_size": 0 00:15:44.112 }, 00:15:44.112 { 00:15:44.112 "name": "BaseBdev2", 00:15:44.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.112 "is_configured": false, 00:15:44.112 "data_offset": 0, 00:15:44.112 "data_size": 0 00:15:44.112 }, 00:15:44.112 { 00:15:44.112 "name": "BaseBdev3", 00:15:44.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.112 "is_configured": false, 00:15:44.112 "data_offset": 0, 00:15:44.112 "data_size": 0 00:15:44.112 }, 00:15:44.112 { 00:15:44.112 "name": "BaseBdev4", 00:15:44.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.112 "is_configured": false, 00:15:44.112 "data_offset": 0, 00:15:44.112 "data_size": 0 00:15:44.112 } 00:15:44.112 ] 00:15:44.112 }' 00:15:44.112 23:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:44.112 23:57:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.680 23:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:44.939 [2024-05-14 23:57:45.405568] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:44.939 [2024-05-14 23:57:45.405601] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2678c00 name Existed_Raid, state configuring 00:15:44.939 23:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:45.197 [2024-05-14 23:57:45.646223] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:45.197 [2024-05-14 23:57:45.646256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:45.197 [2024-05-14 23:57:45.646271] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:45.198 [2024-05-14 23:57:45.646286] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:45.198 [2024-05-14 23:57:45.646298] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:45.198 [2024-05-14 23:57:45.646314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:45.198 [2024-05-14 23:57:45.646327] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:45.198 [2024-05-14 23:57:45.646345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:45.198 23:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:45.457 [2024-05-14 23:57:45.840700] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.457 BaseBdev1 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:45.457 23:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.457 23:57:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:45.715 [ 00:15:45.715 { 00:15:45.715 "name": "BaseBdev1", 00:15:45.715 "aliases": [ 00:15:45.716 "5ae6df36-7b51-4108-9e76-5c9999736fae" 00:15:45.716 ], 00:15:45.716 "product_name": "Malloc disk", 00:15:45.716 "block_size": 512, 00:15:45.716 "num_blocks": 65536, 00:15:45.716 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:45.716 "assigned_rate_limits": { 00:15:45.716 "rw_ios_per_sec": 0, 00:15:45.716 "rw_mbytes_per_sec": 0, 00:15:45.716 "r_mbytes_per_sec": 0, 00:15:45.716 "w_mbytes_per_sec": 0 00:15:45.716 }, 00:15:45.716 "claimed": true, 00:15:45.716 "claim_type": "exclusive_write", 00:15:45.716 "zoned": false, 00:15:45.716 "supported_io_types": { 00:15:45.716 "read": true, 00:15:45.716 "write": true, 00:15:45.716 "unmap": true, 00:15:45.716 "write_zeroes": true, 00:15:45.716 "flush": true, 00:15:45.716 "reset": true, 00:15:45.716 "compare": false, 00:15:45.716 "compare_and_write": false, 00:15:45.716 "abort": true, 00:15:45.716 "nvme_admin": false, 00:15:45.716 "nvme_io": false 00:15:45.716 }, 00:15:45.716 "memory_domains": [ 00:15:45.716 { 00:15:45.716 "dma_device_id": "system", 00:15:45.716 "dma_device_type": 1 00:15:45.716 }, 00:15:45.716 { 00:15:45.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.716 "dma_device_type": 2 00:15:45.716 } 00:15:45.716 ], 00:15:45.716 "driver_specific": {} 00:15:45.716 } 00:15:45.716 ] 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # 
local strip_size=64 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.716 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.975 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:45.975 "name": "Existed_Raid", 00:15:45.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.975 "strip_size_kb": 64, 00:15:45.975 "state": "configuring", 00:15:45.975 "raid_level": "raid0", 00:15:45.975 "superblock": false, 00:15:45.975 "num_base_bdevs": 4, 00:15:45.975 "num_base_bdevs_discovered": 1, 00:15:45.975 "num_base_bdevs_operational": 4, 00:15:45.975 "base_bdevs_list": [ 00:15:45.975 { 00:15:45.975 "name": "BaseBdev1", 00:15:45.975 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:45.975 "is_configured": true, 00:15:45.975 "data_offset": 0, 00:15:45.975 "data_size": 65536 00:15:45.975 }, 00:15:45.975 { 00:15:45.975 "name": "BaseBdev2", 00:15:45.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.975 "is_configured": false, 00:15:45.975 "data_offset": 0, 00:15:45.975 "data_size": 0 00:15:45.975 }, 00:15:45.975 { 00:15:45.975 "name": "BaseBdev3", 00:15:45.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.975 "is_configured": false, 00:15:45.975 "data_offset": 0, 00:15:45.975 "data_size": 0 00:15:45.975 }, 00:15:45.975 { 00:15:45.975 "name": "BaseBdev4", 00:15:45.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.975 "is_configured": false, 00:15:45.975 "data_offset": 0, 00:15:45.975 "data_size": 0 00:15:45.975 } 00:15:45.975 ] 00:15:45.975 }' 00:15:45.975 23:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:45.975 23:57:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.542 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.800 [2024-05-14 23:57:47.208322] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.800 [2024-05-14 23:57:47.208363] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2678ea0 name Existed_Raid, state configuring 00:15:46.801 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:46.801 [2024-05-14 23:57:47.384829] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.801 [2024-05-14 23:57:47.386298] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.801 
[2024-05-14 23:57:47.386333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.801 [2024-05-14 23:57:47.386349] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.801 [2024-05-14 23:57:47.386363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.801 [2024-05-14 23:57:47.386376] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:46.801 [2024-05-14 23:57:47.386391] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:47.060 "name": "Existed_Raid", 00:15:47.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.060 "strip_size_kb": 64, 00:15:47.060 "state": "configuring", 00:15:47.060 "raid_level": "raid0", 00:15:47.060 "superblock": false, 00:15:47.060 "num_base_bdevs": 4, 00:15:47.060 "num_base_bdevs_discovered": 1, 00:15:47.060 "num_base_bdevs_operational": 4, 00:15:47.060 "base_bdevs_list": [ 00:15:47.060 { 00:15:47.060 "name": "BaseBdev1", 00:15:47.060 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:47.060 "is_configured": true, 00:15:47.060 "data_offset": 0, 00:15:47.060 "data_size": 65536 00:15:47.060 }, 00:15:47.060 { 00:15:47.060 "name": "BaseBdev2", 00:15:47.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.060 "is_configured": false, 00:15:47.060 "data_offset": 0, 00:15:47.060 "data_size": 0 00:15:47.060 }, 00:15:47.060 { 00:15:47.060 "name": "BaseBdev3", 00:15:47.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.060 "is_configured": false, 00:15:47.060 "data_offset": 0, 00:15:47.060 "data_size": 0 00:15:47.060 }, 00:15:47.060 { 00:15:47.060 "name": 
"BaseBdev4", 00:15:47.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.060 "is_configured": false, 00:15:47.060 "data_offset": 0, 00:15:47.060 "data_size": 0 00:15:47.060 } 00:15:47.060 ] 00:15:47.060 }' 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:47.060 23:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.627 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:47.886 [2024-05-14 23:57:48.374703] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:47.886 BaseBdev2 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:47.886 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.145 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:48.404 [ 00:15:48.404 { 00:15:48.404 "name": "BaseBdev2", 00:15:48.404 "aliases": [ 00:15:48.404 "f3665dcf-49c0-4e87-9152-9cc80b840a09" 00:15:48.404 ], 00:15:48.404 "product_name": "Malloc disk", 00:15:48.404 "block_size": 512, 00:15:48.404 "num_blocks": 65536, 00:15:48.404 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:48.404 "assigned_rate_limits": { 00:15:48.404 "rw_ios_per_sec": 0, 00:15:48.404 "rw_mbytes_per_sec": 0, 00:15:48.404 "r_mbytes_per_sec": 0, 00:15:48.404 "w_mbytes_per_sec": 0 00:15:48.404 }, 00:15:48.404 "claimed": true, 00:15:48.404 "claim_type": "exclusive_write", 00:15:48.404 "zoned": false, 00:15:48.404 "supported_io_types": { 00:15:48.404 "read": true, 00:15:48.404 "write": true, 00:15:48.404 "unmap": true, 00:15:48.404 "write_zeroes": true, 00:15:48.404 "flush": true, 00:15:48.404 "reset": true, 00:15:48.404 "compare": false, 00:15:48.404 "compare_and_write": false, 00:15:48.404 "abort": true, 00:15:48.404 "nvme_admin": false, 00:15:48.404 "nvme_io": false 00:15:48.404 }, 00:15:48.404 "memory_domains": [ 00:15:48.404 { 00:15:48.404 "dma_device_id": "system", 00:15:48.404 "dma_device_type": 1 00:15:48.404 }, 00:15:48.404 { 00:15:48.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.404 "dma_device_type": 2 00:15:48.404 } 00:15:48.404 ], 00:15:48.404 "driver_specific": {} 00:15:48.404 } 00:15:48.404 ] 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 
00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.404 23:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.662 23:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:48.662 "name": "Existed_Raid", 00:15:48.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.662 "strip_size_kb": 64, 00:15:48.662 "state": "configuring", 00:15:48.662 "raid_level": "raid0", 00:15:48.662 "superblock": false, 00:15:48.662 "num_base_bdevs": 4, 00:15:48.662 "num_base_bdevs_discovered": 2, 00:15:48.662 "num_base_bdevs_operational": 4, 00:15:48.662 "base_bdevs_list": [ 00:15:48.662 { 00:15:48.662 "name": "BaseBdev1", 00:15:48.662 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:48.662 "is_configured": true, 00:15:48.662 "data_offset": 0, 00:15:48.662 "data_size": 65536 00:15:48.662 }, 00:15:48.662 { 00:15:48.662 "name": "BaseBdev2", 00:15:48.662 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:48.662 "is_configured": true, 00:15:48.662 "data_offset": 0, 00:15:48.662 "data_size": 65536 00:15:48.662 }, 00:15:48.662 { 00:15:48.662 "name": "BaseBdev3", 00:15:48.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.662 "is_configured": false, 00:15:48.662 "data_offset": 0, 00:15:48.662 "data_size": 0 00:15:48.662 }, 00:15:48.662 { 00:15:48.662 "name": "BaseBdev4", 00:15:48.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.662 "is_configured": false, 00:15:48.662 "data_offset": 0, 00:15:48.662 "data_size": 0 00:15:48.662 } 00:15:48.662 ] 00:15:48.662 }' 00:15:48.662 23:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:48.662 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.230 23:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:49.489 [2024-05-14 23:57:49.947507] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:49.489 BaseBdev3 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:49.489 23:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:49.747 23:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:50.006 [ 00:15:50.006 { 00:15:50.006 "name": "BaseBdev3", 00:15:50.006 "aliases": [ 00:15:50.006 "8efa6991-c0e2-4283-87b0-1fcf734d89d5" 00:15:50.006 ], 00:15:50.006 "product_name": "Malloc disk", 00:15:50.006 "block_size": 512, 00:15:50.006 "num_blocks": 65536, 00:15:50.006 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:50.006 "assigned_rate_limits": { 00:15:50.006 "rw_ios_per_sec": 0, 00:15:50.006 "rw_mbytes_per_sec": 0, 00:15:50.006 "r_mbytes_per_sec": 0, 00:15:50.006 "w_mbytes_per_sec": 0 00:15:50.006 }, 00:15:50.006 "claimed": true, 00:15:50.006 "claim_type": "exclusive_write", 00:15:50.006 "zoned": false, 00:15:50.006 "supported_io_types": { 00:15:50.006 "read": true, 00:15:50.006 "write": true, 00:15:50.006 "unmap": true, 00:15:50.006 "write_zeroes": true, 00:15:50.006 "flush": true, 00:15:50.006 "reset": true, 00:15:50.006 "compare": false, 00:15:50.006 "compare_and_write": false, 00:15:50.006 "abort": true, 00:15:50.006 "nvme_admin": false, 00:15:50.006 "nvme_io": false 00:15:50.006 }, 00:15:50.006 "memory_domains": [ 00:15:50.006 { 00:15:50.006 "dma_device_id": "system", 00:15:50.006 "dma_device_type": 1 00:15:50.006 }, 00:15:50.006 { 00:15:50.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.006 "dma_device_type": 2 00:15:50.006 } 00:15:50.006 ], 00:15:50.006 "driver_specific": {} 00:15:50.006 } 00:15:50.006 ] 00:15:50.006 23:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:50.006 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 
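Each pass of the loop above creates one more malloc base bdev, waits for the configuring raid to claim it, and then runs verify_raid_bdev_state against the RPC output, exactly as traced for BaseBdev2 and BaseBdev3. A condensed sketch of that per-bdev step, built only from the rpc.py commands and jq filter visible in the trace; the field-by-field checks illustrate what the helper asserts at this point (three of four bases discovered), not its literal code:
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# create the next base bdev; the configuring raid claims it as soon as examine sees it
$rpc bdev_malloc_create 32 512 -b BaseBdev3
# pull the raid info and check the fields the state test cares about
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[ "$(echo "$info" | jq -r '.state')" = configuring ]
[ "$(echo "$info" | jq -r '.raid_level')" = raid0 ]
[ "$(echo "$info" | jq -r '.strip_size_kb')" = 64 ]
[ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" = 3 ]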
00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.007 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.265 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:50.265 "name": "Existed_Raid", 00:15:50.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.265 "strip_size_kb": 64, 00:15:50.265 "state": "configuring", 00:15:50.265 "raid_level": "raid0", 00:15:50.265 "superblock": false, 00:15:50.265 "num_base_bdevs": 4, 00:15:50.265 "num_base_bdevs_discovered": 3, 00:15:50.265 "num_base_bdevs_operational": 4, 00:15:50.265 "base_bdevs_list": [ 00:15:50.265 { 00:15:50.265 "name": "BaseBdev1", 00:15:50.265 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:50.265 "is_configured": true, 00:15:50.265 "data_offset": 0, 00:15:50.265 "data_size": 65536 00:15:50.265 }, 00:15:50.265 { 00:15:50.265 "name": "BaseBdev2", 00:15:50.265 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:50.265 "is_configured": true, 00:15:50.265 "data_offset": 0, 00:15:50.265 "data_size": 65536 00:15:50.265 }, 00:15:50.265 { 00:15:50.265 "name": "BaseBdev3", 00:15:50.265 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:50.265 "is_configured": true, 00:15:50.265 "data_offset": 0, 00:15:50.265 "data_size": 65536 00:15:50.265 }, 00:15:50.265 { 00:15:50.265 "name": "BaseBdev4", 00:15:50.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.265 "is_configured": false, 00:15:50.265 "data_offset": 0, 00:15:50.265 "data_size": 0 00:15:50.265 } 00:15:50.265 ] 00:15:50.265 }' 00:15:50.265 23:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:50.265 23:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.856 23:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:51.115 [2024-05-14 23:57:51.527042] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:51.115 [2024-05-14 23:57:51.527081] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2678470 00:15:51.115 [2024-05-14 23:57:51.527090] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:51.115 [2024-05-14 23:57:51.527289] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2678b40 00:15:51.115 [2024-05-14 23:57:51.527424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2678470 00:15:51.115 [2024-05-14 23:57:51.527435] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2678470 00:15:51.115 [2024-05-14 23:57:51.527602] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.115 BaseBdev4 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:15:51.115 23:57:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:51.115 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.373 23:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:51.632 [ 00:15:51.632 { 00:15:51.632 "name": "BaseBdev4", 00:15:51.632 "aliases": [ 00:15:51.632 "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb" 00:15:51.632 ], 00:15:51.632 "product_name": "Malloc disk", 00:15:51.632 "block_size": 512, 00:15:51.632 "num_blocks": 65536, 00:15:51.632 "uuid": "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb", 00:15:51.632 "assigned_rate_limits": { 00:15:51.632 "rw_ios_per_sec": 0, 00:15:51.632 "rw_mbytes_per_sec": 0, 00:15:51.632 "r_mbytes_per_sec": 0, 00:15:51.632 "w_mbytes_per_sec": 0 00:15:51.632 }, 00:15:51.632 "claimed": true, 00:15:51.632 "claim_type": "exclusive_write", 00:15:51.632 "zoned": false, 00:15:51.632 "supported_io_types": { 00:15:51.632 "read": true, 00:15:51.632 "write": true, 00:15:51.632 "unmap": true, 00:15:51.632 "write_zeroes": true, 00:15:51.632 "flush": true, 00:15:51.632 "reset": true, 00:15:51.632 "compare": false, 00:15:51.632 "compare_and_write": false, 00:15:51.632 "abort": true, 00:15:51.632 "nvme_admin": false, 00:15:51.632 "nvme_io": false 00:15:51.632 }, 00:15:51.632 "memory_domains": [ 00:15:51.632 { 00:15:51.632 "dma_device_id": "system", 00:15:51.632 "dma_device_type": 1 00:15:51.632 }, 00:15:51.632 { 00:15:51.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.632 "dma_device_type": 2 00:15:51.632 } 00:15:51.632 ], 00:15:51.632 "driver_specific": {} 00:15:51.632 } 00:15:51.632 ] 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.632 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.891 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:51.891 "name": "Existed_Raid", 00:15:51.891 "uuid": "bfc9dc42-0a99-4d17-80e9-665c4c367f8f", 00:15:51.891 "strip_size_kb": 64, 00:15:51.891 "state": "online", 00:15:51.891 "raid_level": "raid0", 00:15:51.891 "superblock": false, 00:15:51.891 "num_base_bdevs": 4, 00:15:51.891 "num_base_bdevs_discovered": 4, 00:15:51.891 "num_base_bdevs_operational": 4, 00:15:51.891 "base_bdevs_list": [ 00:15:51.891 { 00:15:51.891 "name": "BaseBdev1", 00:15:51.891 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:51.891 "is_configured": true, 00:15:51.891 "data_offset": 0, 00:15:51.891 "data_size": 65536 00:15:51.891 }, 00:15:51.891 { 00:15:51.891 "name": "BaseBdev2", 00:15:51.891 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:51.892 "is_configured": true, 00:15:51.892 "data_offset": 0, 00:15:51.892 "data_size": 65536 00:15:51.892 }, 00:15:51.892 { 00:15:51.892 "name": "BaseBdev3", 00:15:51.892 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:51.892 "is_configured": true, 00:15:51.892 "data_offset": 0, 00:15:51.892 "data_size": 65536 00:15:51.892 }, 00:15:51.892 { 00:15:51.892 "name": "BaseBdev4", 00:15:51.892 "uuid": "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb", 00:15:51.892 "is_configured": true, 00:15:51.892 "data_offset": 0, 00:15:51.892 "data_size": 65536 00:15:51.892 } 00:15:51.892 ] 00:15:51.892 }' 00:15:51.892 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:51.892 23:57:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:52.457 23:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:52.716 [2024-05-14 23:57:53.127571] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:52.716 "name": "Existed_Raid", 00:15:52.716 "aliases": [ 00:15:52.716 "bfc9dc42-0a99-4d17-80e9-665c4c367f8f" 00:15:52.716 ], 00:15:52.716 
"product_name": "Raid Volume", 00:15:52.716 "block_size": 512, 00:15:52.716 "num_blocks": 262144, 00:15:52.716 "uuid": "bfc9dc42-0a99-4d17-80e9-665c4c367f8f", 00:15:52.716 "assigned_rate_limits": { 00:15:52.716 "rw_ios_per_sec": 0, 00:15:52.716 "rw_mbytes_per_sec": 0, 00:15:52.716 "r_mbytes_per_sec": 0, 00:15:52.716 "w_mbytes_per_sec": 0 00:15:52.716 }, 00:15:52.716 "claimed": false, 00:15:52.716 "zoned": false, 00:15:52.716 "supported_io_types": { 00:15:52.716 "read": true, 00:15:52.716 "write": true, 00:15:52.716 "unmap": true, 00:15:52.716 "write_zeroes": true, 00:15:52.716 "flush": true, 00:15:52.716 "reset": true, 00:15:52.716 "compare": false, 00:15:52.716 "compare_and_write": false, 00:15:52.716 "abort": false, 00:15:52.716 "nvme_admin": false, 00:15:52.716 "nvme_io": false 00:15:52.716 }, 00:15:52.716 "memory_domains": [ 00:15:52.716 { 00:15:52.716 "dma_device_id": "system", 00:15:52.716 "dma_device_type": 1 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.716 "dma_device_type": 2 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "system", 00:15:52.716 "dma_device_type": 1 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.716 "dma_device_type": 2 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "system", 00:15:52.716 "dma_device_type": 1 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.716 "dma_device_type": 2 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "system", 00:15:52.716 "dma_device_type": 1 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.716 "dma_device_type": 2 00:15:52.716 } 00:15:52.716 ], 00:15:52.716 "driver_specific": { 00:15:52.716 "raid": { 00:15:52.716 "uuid": "bfc9dc42-0a99-4d17-80e9-665c4c367f8f", 00:15:52.716 "strip_size_kb": 64, 00:15:52.716 "state": "online", 00:15:52.716 "raid_level": "raid0", 00:15:52.716 "superblock": false, 00:15:52.716 "num_base_bdevs": 4, 00:15:52.716 "num_base_bdevs_discovered": 4, 00:15:52.716 "num_base_bdevs_operational": 4, 00:15:52.716 "base_bdevs_list": [ 00:15:52.716 { 00:15:52.716 "name": "BaseBdev1", 00:15:52.716 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:52.716 "is_configured": true, 00:15:52.716 "data_offset": 0, 00:15:52.716 "data_size": 65536 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "name": "BaseBdev2", 00:15:52.716 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:52.716 "is_configured": true, 00:15:52.716 "data_offset": 0, 00:15:52.716 "data_size": 65536 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "name": "BaseBdev3", 00:15:52.716 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:52.716 "is_configured": true, 00:15:52.716 "data_offset": 0, 00:15:52.716 "data_size": 65536 00:15:52.716 }, 00:15:52.716 { 00:15:52.716 "name": "BaseBdev4", 00:15:52.716 "uuid": "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb", 00:15:52.716 "is_configured": true, 00:15:52.716 "data_offset": 0, 00:15:52.716 "data_size": 65536 00:15:52.716 } 00:15:52.716 ] 00:15:52.716 } 00:15:52.716 } 00:15:52.716 }' 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:52.716 BaseBdev2 00:15:52.716 BaseBdev3 00:15:52.716 BaseBdev4' 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:52.716 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:52.974 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:52.974 "name": "BaseBdev1", 00:15:52.974 "aliases": [ 00:15:52.974 "5ae6df36-7b51-4108-9e76-5c9999736fae" 00:15:52.974 ], 00:15:52.974 "product_name": "Malloc disk", 00:15:52.974 "block_size": 512, 00:15:52.974 "num_blocks": 65536, 00:15:52.974 "uuid": "5ae6df36-7b51-4108-9e76-5c9999736fae", 00:15:52.974 "assigned_rate_limits": { 00:15:52.974 "rw_ios_per_sec": 0, 00:15:52.974 "rw_mbytes_per_sec": 0, 00:15:52.974 "r_mbytes_per_sec": 0, 00:15:52.974 "w_mbytes_per_sec": 0 00:15:52.974 }, 00:15:52.974 "claimed": true, 00:15:52.974 "claim_type": "exclusive_write", 00:15:52.974 "zoned": false, 00:15:52.974 "supported_io_types": { 00:15:52.974 "read": true, 00:15:52.974 "write": true, 00:15:52.974 "unmap": true, 00:15:52.974 "write_zeroes": true, 00:15:52.974 "flush": true, 00:15:52.974 "reset": true, 00:15:52.974 "compare": false, 00:15:52.974 "compare_and_write": false, 00:15:52.974 "abort": true, 00:15:52.974 "nvme_admin": false, 00:15:52.974 "nvme_io": false 00:15:52.974 }, 00:15:52.974 "memory_domains": [ 00:15:52.974 { 00:15:52.974 "dma_device_id": "system", 00:15:52.974 "dma_device_type": 1 00:15:52.974 }, 00:15:52.974 { 00:15:52.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.974 "dma_device_type": 2 00:15:52.974 } 00:15:52.974 ], 00:15:52.974 "driver_specific": {} 00:15:52.974 }' 00:15:52.974 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.974 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.974 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:52.974 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:53.231 23:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:53.489 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:53.489 "name": 
"BaseBdev2", 00:15:53.489 "aliases": [ 00:15:53.489 "f3665dcf-49c0-4e87-9152-9cc80b840a09" 00:15:53.489 ], 00:15:53.489 "product_name": "Malloc disk", 00:15:53.489 "block_size": 512, 00:15:53.489 "num_blocks": 65536, 00:15:53.489 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:53.489 "assigned_rate_limits": { 00:15:53.489 "rw_ios_per_sec": 0, 00:15:53.489 "rw_mbytes_per_sec": 0, 00:15:53.489 "r_mbytes_per_sec": 0, 00:15:53.489 "w_mbytes_per_sec": 0 00:15:53.489 }, 00:15:53.489 "claimed": true, 00:15:53.489 "claim_type": "exclusive_write", 00:15:53.489 "zoned": false, 00:15:53.489 "supported_io_types": { 00:15:53.489 "read": true, 00:15:53.489 "write": true, 00:15:53.489 "unmap": true, 00:15:53.489 "write_zeroes": true, 00:15:53.489 "flush": true, 00:15:53.489 "reset": true, 00:15:53.489 "compare": false, 00:15:53.489 "compare_and_write": false, 00:15:53.489 "abort": true, 00:15:53.489 "nvme_admin": false, 00:15:53.489 "nvme_io": false 00:15:53.489 }, 00:15:53.489 "memory_domains": [ 00:15:53.489 { 00:15:53.489 "dma_device_id": "system", 00:15:53.489 "dma_device_type": 1 00:15:53.489 }, 00:15:53.489 { 00:15:53.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.489 "dma_device_type": 2 00:15:53.489 } 00:15:53.489 ], 00:15:53.489 "driver_specific": {} 00:15:53.489 }' 00:15:53.489 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:53.489 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:53.750 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:53.750 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.751 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:54.010 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:54.010 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:54.010 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:54.010 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:54.268 "name": "BaseBdev3", 00:15:54.268 "aliases": [ 00:15:54.268 "8efa6991-c0e2-4283-87b0-1fcf734d89d5" 00:15:54.268 ], 00:15:54.268 "product_name": "Malloc disk", 00:15:54.268 "block_size": 512, 00:15:54.268 "num_blocks": 65536, 00:15:54.268 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:54.268 "assigned_rate_limits": { 00:15:54.268 "rw_ios_per_sec": 0, 00:15:54.268 "rw_mbytes_per_sec": 0, 00:15:54.268 "r_mbytes_per_sec": 0, 00:15:54.268 "w_mbytes_per_sec": 0 00:15:54.268 }, 
00:15:54.268 "claimed": true, 00:15:54.268 "claim_type": "exclusive_write", 00:15:54.268 "zoned": false, 00:15:54.268 "supported_io_types": { 00:15:54.268 "read": true, 00:15:54.268 "write": true, 00:15:54.268 "unmap": true, 00:15:54.268 "write_zeroes": true, 00:15:54.268 "flush": true, 00:15:54.268 "reset": true, 00:15:54.268 "compare": false, 00:15:54.268 "compare_and_write": false, 00:15:54.268 "abort": true, 00:15:54.268 "nvme_admin": false, 00:15:54.268 "nvme_io": false 00:15:54.268 }, 00:15:54.268 "memory_domains": [ 00:15:54.268 { 00:15:54.268 "dma_device_id": "system", 00:15:54.268 "dma_device_type": 1 00:15:54.268 }, 00:15:54.268 { 00:15:54.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.268 "dma_device_type": 2 00:15:54.268 } 00:15:54.268 ], 00:15:54.268 "driver_specific": {} 00:15:54.268 }' 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:54.268 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:54.526 23:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:54.784 "name": "BaseBdev4", 00:15:54.784 "aliases": [ 00:15:54.784 "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb" 00:15:54.784 ], 00:15:54.784 "product_name": "Malloc disk", 00:15:54.784 "block_size": 512, 00:15:54.784 "num_blocks": 65536, 00:15:54.784 "uuid": "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb", 00:15:54.784 "assigned_rate_limits": { 00:15:54.784 "rw_ios_per_sec": 0, 00:15:54.784 "rw_mbytes_per_sec": 0, 00:15:54.784 "r_mbytes_per_sec": 0, 00:15:54.784 "w_mbytes_per_sec": 0 00:15:54.784 }, 00:15:54.784 "claimed": true, 00:15:54.784 "claim_type": "exclusive_write", 00:15:54.784 "zoned": false, 00:15:54.784 "supported_io_types": { 00:15:54.784 "read": true, 00:15:54.784 "write": true, 00:15:54.784 "unmap": true, 00:15:54.784 "write_zeroes": true, 00:15:54.784 "flush": true, 00:15:54.784 "reset": true, 00:15:54.784 "compare": false, 00:15:54.784 "compare_and_write": false, 00:15:54.784 "abort": true, 00:15:54.784 "nvme_admin": false, 00:15:54.784 "nvme_io": false 
00:15:54.784 }, 00:15:54.784 "memory_domains": [ 00:15:54.784 { 00:15:54.784 "dma_device_id": "system", 00:15:54.784 "dma_device_type": 1 00:15:54.784 }, 00:15:54.784 { 00:15:54.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.784 "dma_device_type": 2 00:15:54.784 } 00:15:54.784 ], 00:15:54.784 "driver_specific": {} 00:15:54.784 }' 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:54.784 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:55.042 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:55.300 [2024-05-14 23:57:55.806645] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:55.300 [2024-05-14 23:57:55.806670] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.300 [2024-05-14 23:57:55.806719] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.300 23:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.558 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:55.558 "name": "Existed_Raid", 00:15:55.558 "uuid": "bfc9dc42-0a99-4d17-80e9-665c4c367f8f", 00:15:55.558 "strip_size_kb": 64, 00:15:55.558 "state": "offline", 00:15:55.558 "raid_level": "raid0", 00:15:55.558 "superblock": false, 00:15:55.558 "num_base_bdevs": 4, 00:15:55.558 "num_base_bdevs_discovered": 3, 00:15:55.558 "num_base_bdevs_operational": 3, 00:15:55.558 "base_bdevs_list": [ 00:15:55.558 { 00:15:55.558 "name": null, 00:15:55.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.558 "is_configured": false, 00:15:55.558 "data_offset": 0, 00:15:55.558 "data_size": 65536 00:15:55.558 }, 00:15:55.558 { 00:15:55.558 "name": "BaseBdev2", 00:15:55.558 "uuid": "f3665dcf-49c0-4e87-9152-9cc80b840a09", 00:15:55.558 "is_configured": true, 00:15:55.558 "data_offset": 0, 00:15:55.558 "data_size": 65536 00:15:55.558 }, 00:15:55.558 { 00:15:55.558 "name": "BaseBdev3", 00:15:55.558 "uuid": "8efa6991-c0e2-4283-87b0-1fcf734d89d5", 00:15:55.558 "is_configured": true, 00:15:55.558 "data_offset": 0, 00:15:55.558 "data_size": 65536 00:15:55.558 }, 00:15:55.558 { 00:15:55.558 "name": "BaseBdev4", 00:15:55.558 "uuid": "d5beac33-dbc3-4096-8c8a-ed9e45cc93cb", 00:15:55.558 "is_configured": true, 00:15:55.558 "data_offset": 0, 00:15:55.558 "data_size": 65536 00:15:55.558 } 00:15:55.558 ] 00:15:55.558 }' 00:15:55.558 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:55.558 23:57:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.123 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:56.123 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:56.123 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.123 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:56.381 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:56.381 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:56.381 23:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:56.638 [2024-05-14 23:57:57.135180] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:56.638 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:56.638 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:56.638 23:57:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.638 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:56.895 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:56.895 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:56.896 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:57.153 [2024-05-14 23:57:57.632926] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:57.153 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:57.153 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:57.153 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.153 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:57.411 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:57.411 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:57.411 23:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:57.669 [2024-05-14 23:57:58.124670] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:57.669 [2024-05-14 23:57:58.124713] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2678470 name Existed_Raid, state offline 00:15:57.669 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:57.669 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:57.669 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.669 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:57.927 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:58.185 BaseBdev2 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local 
bdev_name=BaseBdev2 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:58.185 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.443 23:57:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:58.702 [ 00:15:58.702 { 00:15:58.702 "name": "BaseBdev2", 00:15:58.702 "aliases": [ 00:15:58.702 "5db7afff-0226-4b65-bc5c-da99a4d56e63" 00:15:58.702 ], 00:15:58.702 "product_name": "Malloc disk", 00:15:58.702 "block_size": 512, 00:15:58.702 "num_blocks": 65536, 00:15:58.702 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:15:58.702 "assigned_rate_limits": { 00:15:58.702 "rw_ios_per_sec": 0, 00:15:58.702 "rw_mbytes_per_sec": 0, 00:15:58.702 "r_mbytes_per_sec": 0, 00:15:58.702 "w_mbytes_per_sec": 0 00:15:58.702 }, 00:15:58.702 "claimed": false, 00:15:58.702 "zoned": false, 00:15:58.702 "supported_io_types": { 00:15:58.702 "read": true, 00:15:58.702 "write": true, 00:15:58.702 "unmap": true, 00:15:58.702 "write_zeroes": true, 00:15:58.702 "flush": true, 00:15:58.702 "reset": true, 00:15:58.702 "compare": false, 00:15:58.702 "compare_and_write": false, 00:15:58.702 "abort": true, 00:15:58.702 "nvme_admin": false, 00:15:58.702 "nvme_io": false 00:15:58.702 }, 00:15:58.702 "memory_domains": [ 00:15:58.702 { 00:15:58.702 "dma_device_id": "system", 00:15:58.702 "dma_device_type": 1 00:15:58.702 }, 00:15:58.702 { 00:15:58.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.702 "dma_device_type": 2 00:15:58.702 } 00:15:58.702 ], 00:15:58.702 "driver_specific": {} 00:15:58.702 } 00:15:58.702 ] 00:15:58.702 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:58.702 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:58.702 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:58.702 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:58.960 BaseBdev3 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:58.960 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.218 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:59.477 [ 00:15:59.477 { 00:15:59.477 "name": "BaseBdev3", 00:15:59.477 "aliases": [ 00:15:59.477 "9cbd60c4-6e20-4f30-a10c-086be640d3df" 00:15:59.477 ], 00:15:59.477 "product_name": "Malloc disk", 00:15:59.477 "block_size": 512, 00:15:59.477 "num_blocks": 65536, 00:15:59.477 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:15:59.477 "assigned_rate_limits": { 00:15:59.477 "rw_ios_per_sec": 0, 00:15:59.477 "rw_mbytes_per_sec": 0, 00:15:59.477 "r_mbytes_per_sec": 0, 00:15:59.477 "w_mbytes_per_sec": 0 00:15:59.477 }, 00:15:59.477 "claimed": false, 00:15:59.477 "zoned": false, 00:15:59.477 "supported_io_types": { 00:15:59.477 "read": true, 00:15:59.477 "write": true, 00:15:59.477 "unmap": true, 00:15:59.477 "write_zeroes": true, 00:15:59.477 "flush": true, 00:15:59.477 "reset": true, 00:15:59.477 "compare": false, 00:15:59.477 "compare_and_write": false, 00:15:59.477 "abort": true, 00:15:59.477 "nvme_admin": false, 00:15:59.477 "nvme_io": false 00:15:59.477 }, 00:15:59.477 "memory_domains": [ 00:15:59.477 { 00:15:59.477 "dma_device_id": "system", 00:15:59.477 "dma_device_type": 1 00:15:59.477 }, 00:15:59.477 { 00:15:59.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.477 "dma_device_type": 2 00:15:59.477 } 00:15:59.477 ], 00:15:59.477 "driver_specific": {} 00:15:59.477 } 00:15:59.477 ] 00:15:59.477 23:57:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:59.477 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:59.477 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:59.477 23:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:59.735 BaseBdev4 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:59.735 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.994 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:59.994 [ 00:15:59.994 { 00:15:59.994 "name": "BaseBdev4", 00:15:59.994 "aliases": [ 00:15:59.994 "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e" 00:15:59.994 ], 00:15:59.994 "product_name": "Malloc disk", 00:15:59.994 "block_size": 512, 00:15:59.994 
"num_blocks": 65536, 00:15:59.994 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:15:59.994 "assigned_rate_limits": { 00:15:59.994 "rw_ios_per_sec": 0, 00:15:59.994 "rw_mbytes_per_sec": 0, 00:15:59.994 "r_mbytes_per_sec": 0, 00:15:59.994 "w_mbytes_per_sec": 0 00:15:59.994 }, 00:15:59.994 "claimed": false, 00:15:59.994 "zoned": false, 00:15:59.994 "supported_io_types": { 00:15:59.994 "read": true, 00:15:59.994 "write": true, 00:15:59.994 "unmap": true, 00:15:59.994 "write_zeroes": true, 00:15:59.994 "flush": true, 00:15:59.994 "reset": true, 00:15:59.994 "compare": false, 00:15:59.994 "compare_and_write": false, 00:15:59.994 "abort": true, 00:15:59.994 "nvme_admin": false, 00:15:59.994 "nvme_io": false 00:15:59.994 }, 00:15:59.994 "memory_domains": [ 00:15:59.994 { 00:15:59.994 "dma_device_id": "system", 00:15:59.994 "dma_device_type": 1 00:15:59.994 }, 00:15:59.994 { 00:15:59.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.994 "dma_device_type": 2 00:15:59.994 } 00:15:59.994 ], 00:15:59.994 "driver_specific": {} 00:15:59.994 } 00:15:59.994 ] 00:15:59.994 23:58:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:59.994 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:59.994 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:59.994 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:00.252 [2024-05-14 23:58:00.798411] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:00.252 [2024-05-14 23:58:00.798454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:00.252 [2024-05-14 23:58:00.798479] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:00.252 [2024-05-14 23:58:00.799924] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:00.252 [2024-05-14 23:58:00.799971] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.252 23:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.511 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:00.511 "name": "Existed_Raid", 00:16:00.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.511 "strip_size_kb": 64, 00:16:00.511 "state": "configuring", 00:16:00.511 "raid_level": "raid0", 00:16:00.511 "superblock": false, 00:16:00.511 "num_base_bdevs": 4, 00:16:00.511 "num_base_bdevs_discovered": 3, 00:16:00.511 "num_base_bdevs_operational": 4, 00:16:00.511 "base_bdevs_list": [ 00:16:00.511 { 00:16:00.511 "name": "BaseBdev1", 00:16:00.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.511 "is_configured": false, 00:16:00.511 "data_offset": 0, 00:16:00.511 "data_size": 0 00:16:00.511 }, 00:16:00.511 { 00:16:00.511 "name": "BaseBdev2", 00:16:00.511 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:00.511 "is_configured": true, 00:16:00.511 "data_offset": 0, 00:16:00.511 "data_size": 65536 00:16:00.511 }, 00:16:00.511 { 00:16:00.511 "name": "BaseBdev3", 00:16:00.511 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:00.511 "is_configured": true, 00:16:00.511 "data_offset": 0, 00:16:00.511 "data_size": 65536 00:16:00.511 }, 00:16:00.511 { 00:16:00.511 "name": "BaseBdev4", 00:16:00.511 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:00.511 "is_configured": true, 00:16:00.511 "data_offset": 0, 00:16:00.511 "data_size": 65536 00:16:00.511 } 00:16:00.511 ] 00:16:00.511 }' 00:16:00.511 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:00.511 23:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.077 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:01.334 [2024-05-14 23:58:01.873332] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:01.334 23:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.591 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:01.591 "name": "Existed_Raid", 00:16:01.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.591 "strip_size_kb": 64, 00:16:01.591 "state": "configuring", 00:16:01.591 "raid_level": "raid0", 00:16:01.591 "superblock": false, 00:16:01.591 "num_base_bdevs": 4, 00:16:01.591 "num_base_bdevs_discovered": 2, 00:16:01.591 "num_base_bdevs_operational": 4, 00:16:01.591 "base_bdevs_list": [ 00:16:01.591 { 00:16:01.591 "name": "BaseBdev1", 00:16:01.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.591 "is_configured": false, 00:16:01.591 "data_offset": 0, 00:16:01.591 "data_size": 0 00:16:01.591 }, 00:16:01.591 { 00:16:01.591 "name": null, 00:16:01.591 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:01.591 "is_configured": false, 00:16:01.591 "data_offset": 0, 00:16:01.591 "data_size": 65536 00:16:01.591 }, 00:16:01.591 { 00:16:01.591 "name": "BaseBdev3", 00:16:01.591 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:01.591 "is_configured": true, 00:16:01.591 "data_offset": 0, 00:16:01.591 "data_size": 65536 00:16:01.591 }, 00:16:01.591 { 00:16:01.591 "name": "BaseBdev4", 00:16:01.591 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:01.591 "is_configured": true, 00:16:01.591 "data_offset": 0, 00:16:01.591 "data_size": 65536 00:16:01.591 } 00:16:01.591 ] 00:16:01.591 }' 00:16:01.591 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:01.591 23:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.157 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.157 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:02.415 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:02.415 23:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:02.673 [2024-05-14 23:58:03.137231] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:02.673 BaseBdev1 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:02.673 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:02.933 23:58:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:03.191 [ 00:16:03.191 { 00:16:03.191 "name": "BaseBdev1", 00:16:03.191 "aliases": [ 00:16:03.191 "a6eb5cd8-9e13-4a65-acb0-b148eae43b01" 00:16:03.191 ], 00:16:03.191 "product_name": "Malloc disk", 00:16:03.191 "block_size": 512, 00:16:03.191 "num_blocks": 65536, 00:16:03.191 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:03.191 "assigned_rate_limits": { 00:16:03.191 "rw_ios_per_sec": 0, 00:16:03.191 "rw_mbytes_per_sec": 0, 00:16:03.191 "r_mbytes_per_sec": 0, 00:16:03.191 "w_mbytes_per_sec": 0 00:16:03.191 }, 00:16:03.191 "claimed": true, 00:16:03.191 "claim_type": "exclusive_write", 00:16:03.191 "zoned": false, 00:16:03.191 "supported_io_types": { 00:16:03.191 "read": true, 00:16:03.191 "write": true, 00:16:03.191 "unmap": true, 00:16:03.191 "write_zeroes": true, 00:16:03.191 "flush": true, 00:16:03.191 "reset": true, 00:16:03.191 "compare": false, 00:16:03.191 "compare_and_write": false, 00:16:03.191 "abort": true, 00:16:03.191 "nvme_admin": false, 00:16:03.191 "nvme_io": false 00:16:03.191 }, 00:16:03.191 "memory_domains": [ 00:16:03.191 { 00:16:03.191 "dma_device_id": "system", 00:16:03.191 "dma_device_type": 1 00:16:03.191 }, 00:16:03.191 { 00:16:03.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.191 "dma_device_type": 2 00:16:03.191 } 00:16:03.191 ], 00:16:03.191 "driver_specific": {} 00:16:03.191 } 00:16:03.191 ] 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.191 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.449 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:03.449 "name": "Existed_Raid", 00:16:03.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.449 "strip_size_kb": 64, 00:16:03.449 "state": "configuring", 00:16:03.449 "raid_level": "raid0", 00:16:03.449 "superblock": false, 00:16:03.449 "num_base_bdevs": 4, 00:16:03.449 "num_base_bdevs_discovered": 3, 00:16:03.449 
"num_base_bdevs_operational": 4, 00:16:03.449 "base_bdevs_list": [ 00:16:03.449 { 00:16:03.449 "name": "BaseBdev1", 00:16:03.449 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:03.449 "is_configured": true, 00:16:03.449 "data_offset": 0, 00:16:03.449 "data_size": 65536 00:16:03.449 }, 00:16:03.449 { 00:16:03.449 "name": null, 00:16:03.449 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:03.449 "is_configured": false, 00:16:03.449 "data_offset": 0, 00:16:03.449 "data_size": 65536 00:16:03.449 }, 00:16:03.449 { 00:16:03.449 "name": "BaseBdev3", 00:16:03.449 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:03.449 "is_configured": true, 00:16:03.449 "data_offset": 0, 00:16:03.449 "data_size": 65536 00:16:03.449 }, 00:16:03.449 { 00:16:03.449 "name": "BaseBdev4", 00:16:03.449 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:03.449 "is_configured": true, 00:16:03.449 "data_offset": 0, 00:16:03.449 "data_size": 65536 00:16:03.449 } 00:16:03.449 ] 00:16:03.449 }' 00:16:03.449 23:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:03.449 23:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.014 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.014 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:04.273 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:04.273 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:04.530 [2024-05-14 23:58:04.942142] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.530 23:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.788 23:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:16:04.788 "name": "Existed_Raid", 00:16:04.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.788 "strip_size_kb": 64, 00:16:04.788 "state": "configuring", 00:16:04.788 "raid_level": "raid0", 00:16:04.788 "superblock": false, 00:16:04.788 "num_base_bdevs": 4, 00:16:04.788 "num_base_bdevs_discovered": 2, 00:16:04.788 "num_base_bdevs_operational": 4, 00:16:04.788 "base_bdevs_list": [ 00:16:04.788 { 00:16:04.788 "name": "BaseBdev1", 00:16:04.788 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:04.788 "is_configured": true, 00:16:04.788 "data_offset": 0, 00:16:04.788 "data_size": 65536 00:16:04.788 }, 00:16:04.788 { 00:16:04.788 "name": null, 00:16:04.788 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:04.788 "is_configured": false, 00:16:04.788 "data_offset": 0, 00:16:04.788 "data_size": 65536 00:16:04.788 }, 00:16:04.788 { 00:16:04.788 "name": null, 00:16:04.788 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:04.788 "is_configured": false, 00:16:04.788 "data_offset": 0, 00:16:04.788 "data_size": 65536 00:16:04.788 }, 00:16:04.788 { 00:16:04.788 "name": "BaseBdev4", 00:16:04.788 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:04.788 "is_configured": true, 00:16:04.788 "data_offset": 0, 00:16:04.788 "data_size": 65536 00:16:04.788 } 00:16:04.788 ] 00:16:04.788 }' 00:16:04.788 23:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:04.788 23:58:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.353 23:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:05.353 23:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.611 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:05.611 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:05.869 [2024-05-14 23:58:06.273694] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.869 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.127 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:06.127 "name": "Existed_Raid", 00:16:06.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.127 "strip_size_kb": 64, 00:16:06.127 "state": "configuring", 00:16:06.127 "raid_level": "raid0", 00:16:06.127 "superblock": false, 00:16:06.127 "num_base_bdevs": 4, 00:16:06.127 "num_base_bdevs_discovered": 3, 00:16:06.127 "num_base_bdevs_operational": 4, 00:16:06.127 "base_bdevs_list": [ 00:16:06.127 { 00:16:06.127 "name": "BaseBdev1", 00:16:06.127 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:06.127 "is_configured": true, 00:16:06.127 "data_offset": 0, 00:16:06.127 "data_size": 65536 00:16:06.127 }, 00:16:06.127 { 00:16:06.127 "name": null, 00:16:06.127 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:06.127 "is_configured": false, 00:16:06.127 "data_offset": 0, 00:16:06.127 "data_size": 65536 00:16:06.127 }, 00:16:06.127 { 00:16:06.127 "name": "BaseBdev3", 00:16:06.127 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:06.127 "is_configured": true, 00:16:06.127 "data_offset": 0, 00:16:06.127 "data_size": 65536 00:16:06.127 }, 00:16:06.127 { 00:16:06.127 "name": "BaseBdev4", 00:16:06.127 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:06.127 "is_configured": true, 00:16:06.127 "data_offset": 0, 00:16:06.127 "data_size": 65536 00:16:06.127 } 00:16:06.127 ] 00:16:06.127 }' 00:16:06.127 23:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:06.127 23:58:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.695 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.695 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:06.967 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:06.967 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:07.239 [2024-05-14 23:58:07.593246] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.239 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.497 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:07.497 "name": "Existed_Raid", 00:16:07.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.497 "strip_size_kb": 64, 00:16:07.497 "state": "configuring", 00:16:07.497 "raid_level": "raid0", 00:16:07.497 "superblock": false, 00:16:07.497 "num_base_bdevs": 4, 00:16:07.497 "num_base_bdevs_discovered": 2, 00:16:07.497 "num_base_bdevs_operational": 4, 00:16:07.497 "base_bdevs_list": [ 00:16:07.497 { 00:16:07.497 "name": null, 00:16:07.497 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:07.497 "is_configured": false, 00:16:07.497 "data_offset": 0, 00:16:07.497 "data_size": 65536 00:16:07.497 }, 00:16:07.497 { 00:16:07.497 "name": null, 00:16:07.497 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:07.497 "is_configured": false, 00:16:07.497 "data_offset": 0, 00:16:07.497 "data_size": 65536 00:16:07.497 }, 00:16:07.497 { 00:16:07.497 "name": "BaseBdev3", 00:16:07.497 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:07.497 "is_configured": true, 00:16:07.497 "data_offset": 0, 00:16:07.497 "data_size": 65536 00:16:07.497 }, 00:16:07.497 { 00:16:07.497 "name": "BaseBdev4", 00:16:07.497 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:07.497 "is_configured": true, 00:16:07.497 "data_offset": 0, 00:16:07.497 "data_size": 65536 00:16:07.497 } 00:16:07.497 ] 00:16:07.497 }' 00:16:07.497 23:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:07.497 23:58:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.064 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.064 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:08.064 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:08.064 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:08.322 [2024-05-14 23:58:08.853177] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.322 23:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.581 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:08.581 "name": "Existed_Raid", 00:16:08.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.581 "strip_size_kb": 64, 00:16:08.581 "state": "configuring", 00:16:08.581 "raid_level": "raid0", 00:16:08.581 "superblock": false, 00:16:08.581 "num_base_bdevs": 4, 00:16:08.581 "num_base_bdevs_discovered": 3, 00:16:08.581 "num_base_bdevs_operational": 4, 00:16:08.581 "base_bdevs_list": [ 00:16:08.581 { 00:16:08.581 "name": null, 00:16:08.581 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:08.581 "is_configured": false, 00:16:08.581 "data_offset": 0, 00:16:08.581 "data_size": 65536 00:16:08.581 }, 00:16:08.581 { 00:16:08.581 "name": "BaseBdev2", 00:16:08.581 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:08.581 "is_configured": true, 00:16:08.581 "data_offset": 0, 00:16:08.581 "data_size": 65536 00:16:08.581 }, 00:16:08.581 { 00:16:08.581 "name": "BaseBdev3", 00:16:08.581 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:08.581 "is_configured": true, 00:16:08.581 "data_offset": 0, 00:16:08.581 "data_size": 65536 00:16:08.581 }, 00:16:08.581 { 00:16:08.581 "name": "BaseBdev4", 00:16:08.581 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:08.581 "is_configured": true, 00:16:08.581 "data_offset": 0, 00:16:08.581 "data_size": 65536 00:16:08.581 } 00:16:08.581 ] 00:16:08.581 }' 00:16:08.581 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:08.581 23:58:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.146 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.146 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:09.409 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:09.410 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.410 23:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:09.667 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 
a6eb5cd8-9e13-4a65-acb0-b148eae43b01 00:16:09.924 [2024-05-14 23:58:10.353690] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:09.925 [2024-05-14 23:58:10.353728] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x281e8b0 00:16:09.925 [2024-05-14 23:58:10.353737] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:09.925 [2024-05-14 23:58:10.353932] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2593340 00:16:09.925 [2024-05-14 23:58:10.354065] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x281e8b0 00:16:09.925 [2024-05-14 23:58:10.354075] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x281e8b0 00:16:09.925 [2024-05-14 23:58:10.354238] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.925 NewBaseBdev 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:09.925 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.182 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:10.439 [ 00:16:10.439 { 00:16:10.439 "name": "NewBaseBdev", 00:16:10.439 "aliases": [ 00:16:10.439 "a6eb5cd8-9e13-4a65-acb0-b148eae43b01" 00:16:10.439 ], 00:16:10.439 "product_name": "Malloc disk", 00:16:10.439 "block_size": 512, 00:16:10.439 "num_blocks": 65536, 00:16:10.439 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:10.439 "assigned_rate_limits": { 00:16:10.439 "rw_ios_per_sec": 0, 00:16:10.439 "rw_mbytes_per_sec": 0, 00:16:10.439 "r_mbytes_per_sec": 0, 00:16:10.440 "w_mbytes_per_sec": 0 00:16:10.440 }, 00:16:10.440 "claimed": true, 00:16:10.440 "claim_type": "exclusive_write", 00:16:10.440 "zoned": false, 00:16:10.440 "supported_io_types": { 00:16:10.440 "read": true, 00:16:10.440 "write": true, 00:16:10.440 "unmap": true, 00:16:10.440 "write_zeroes": true, 00:16:10.440 "flush": true, 00:16:10.440 "reset": true, 00:16:10.440 "compare": false, 00:16:10.440 "compare_and_write": false, 00:16:10.440 "abort": true, 00:16:10.440 "nvme_admin": false, 00:16:10.440 "nvme_io": false 00:16:10.440 }, 00:16:10.440 "memory_domains": [ 00:16:10.440 { 00:16:10.440 "dma_device_id": "system", 00:16:10.440 "dma_device_type": 1 00:16:10.440 }, 00:16:10.440 { 00:16:10.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.440 "dma_device_type": 2 00:16:10.440 } 00:16:10.440 ], 00:16:10.440 "driver_specific": {} 00:16:10.440 } 00:16:10.440 ] 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:10.440 23:58:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.440 23:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.698 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:10.698 "name": "Existed_Raid", 00:16:10.698 "uuid": "66c31857-4b58-4b32-8347-6311154ba8e6", 00:16:10.698 "strip_size_kb": 64, 00:16:10.698 "state": "online", 00:16:10.698 "raid_level": "raid0", 00:16:10.698 "superblock": false, 00:16:10.698 "num_base_bdevs": 4, 00:16:10.698 "num_base_bdevs_discovered": 4, 00:16:10.698 "num_base_bdevs_operational": 4, 00:16:10.698 "base_bdevs_list": [ 00:16:10.698 { 00:16:10.698 "name": "NewBaseBdev", 00:16:10.698 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:10.698 "is_configured": true, 00:16:10.698 "data_offset": 0, 00:16:10.698 "data_size": 65536 00:16:10.698 }, 00:16:10.698 { 00:16:10.698 "name": "BaseBdev2", 00:16:10.698 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:10.698 "is_configured": true, 00:16:10.698 "data_offset": 0, 00:16:10.698 "data_size": 65536 00:16:10.698 }, 00:16:10.698 { 00:16:10.698 "name": "BaseBdev3", 00:16:10.698 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:10.698 "is_configured": true, 00:16:10.698 "data_offset": 0, 00:16:10.698 "data_size": 65536 00:16:10.698 }, 00:16:10.698 { 00:16:10.698 "name": "BaseBdev4", 00:16:10.698 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:10.698 "is_configured": true, 00:16:10.698 "data_offset": 0, 00:16:10.698 "data_size": 65536 00:16:10.698 } 00:16:10.698 ] 00:16:10.698 }' 00:16:10.698 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:10.699 23:58:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_info 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:11.264 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:11.522 [2024-05-14 23:58:11.906109] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:11.522 "name": "Existed_Raid", 00:16:11.522 "aliases": [ 00:16:11.522 "66c31857-4b58-4b32-8347-6311154ba8e6" 00:16:11.522 ], 00:16:11.522 "product_name": "Raid Volume", 00:16:11.522 "block_size": 512, 00:16:11.522 "num_blocks": 262144, 00:16:11.522 "uuid": "66c31857-4b58-4b32-8347-6311154ba8e6", 00:16:11.522 "assigned_rate_limits": { 00:16:11.522 "rw_ios_per_sec": 0, 00:16:11.522 "rw_mbytes_per_sec": 0, 00:16:11.522 "r_mbytes_per_sec": 0, 00:16:11.522 "w_mbytes_per_sec": 0 00:16:11.522 }, 00:16:11.522 "claimed": false, 00:16:11.522 "zoned": false, 00:16:11.522 "supported_io_types": { 00:16:11.522 "read": true, 00:16:11.522 "write": true, 00:16:11.522 "unmap": true, 00:16:11.522 "write_zeroes": true, 00:16:11.522 "flush": true, 00:16:11.522 "reset": true, 00:16:11.522 "compare": false, 00:16:11.522 "compare_and_write": false, 00:16:11.522 "abort": false, 00:16:11.522 "nvme_admin": false, 00:16:11.522 "nvme_io": false 00:16:11.522 }, 00:16:11.522 "memory_domains": [ 00:16:11.522 { 00:16:11.522 "dma_device_id": "system", 00:16:11.522 "dma_device_type": 1 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.522 "dma_device_type": 2 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "system", 00:16:11.522 "dma_device_type": 1 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.522 "dma_device_type": 2 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "system", 00:16:11.522 "dma_device_type": 1 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.522 "dma_device_type": 2 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "system", 00:16:11.522 "dma_device_type": 1 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.522 "dma_device_type": 2 00:16:11.522 } 00:16:11.522 ], 00:16:11.522 "driver_specific": { 00:16:11.522 "raid": { 00:16:11.522 "uuid": "66c31857-4b58-4b32-8347-6311154ba8e6", 00:16:11.522 "strip_size_kb": 64, 00:16:11.522 "state": "online", 00:16:11.522 "raid_level": "raid0", 00:16:11.522 "superblock": false, 00:16:11.522 "num_base_bdevs": 4, 00:16:11.522 "num_base_bdevs_discovered": 4, 00:16:11.522 "num_base_bdevs_operational": 4, 00:16:11.522 "base_bdevs_list": [ 00:16:11.522 { 00:16:11.522 "name": "NewBaseBdev", 00:16:11.522 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:11.522 "is_configured": true, 00:16:11.522 "data_offset": 0, 00:16:11.522 "data_size": 65536 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "name": "BaseBdev2", 00:16:11.522 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:11.522 "is_configured": true, 00:16:11.522 "data_offset": 0, 00:16:11.522 "data_size": 65536 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "name": 
"BaseBdev3", 00:16:11.522 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:11.522 "is_configured": true, 00:16:11.522 "data_offset": 0, 00:16:11.522 "data_size": 65536 00:16:11.522 }, 00:16:11.522 { 00:16:11.522 "name": "BaseBdev4", 00:16:11.522 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:11.522 "is_configured": true, 00:16:11.522 "data_offset": 0, 00:16:11.522 "data_size": 65536 00:16:11.522 } 00:16:11.522 ] 00:16:11.522 } 00:16:11.522 } 00:16:11.522 }' 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:16:11.522 BaseBdev2 00:16:11.522 BaseBdev3 00:16:11.522 BaseBdev4' 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:11.522 23:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:11.779 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:11.779 "name": "NewBaseBdev", 00:16:11.779 "aliases": [ 00:16:11.779 "a6eb5cd8-9e13-4a65-acb0-b148eae43b01" 00:16:11.779 ], 00:16:11.779 "product_name": "Malloc disk", 00:16:11.779 "block_size": 512, 00:16:11.779 "num_blocks": 65536, 00:16:11.779 "uuid": "a6eb5cd8-9e13-4a65-acb0-b148eae43b01", 00:16:11.779 "assigned_rate_limits": { 00:16:11.779 "rw_ios_per_sec": 0, 00:16:11.779 "rw_mbytes_per_sec": 0, 00:16:11.779 "r_mbytes_per_sec": 0, 00:16:11.779 "w_mbytes_per_sec": 0 00:16:11.780 }, 00:16:11.780 "claimed": true, 00:16:11.780 "claim_type": "exclusive_write", 00:16:11.780 "zoned": false, 00:16:11.780 "supported_io_types": { 00:16:11.780 "read": true, 00:16:11.780 "write": true, 00:16:11.780 "unmap": true, 00:16:11.780 "write_zeroes": true, 00:16:11.780 "flush": true, 00:16:11.780 "reset": true, 00:16:11.780 "compare": false, 00:16:11.780 "compare_and_write": false, 00:16:11.780 "abort": true, 00:16:11.780 "nvme_admin": false, 00:16:11.780 "nvme_io": false 00:16:11.780 }, 00:16:11.780 "memory_domains": [ 00:16:11.780 { 00:16:11.780 "dma_device_id": "system", 00:16:11.780 "dma_device_type": 1 00:16:11.780 }, 00:16:11.780 { 00:16:11.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.780 "dma_device_type": 2 00:16:11.780 } 00:16:11.780 ], 00:16:11.780 "driver_specific": {} 00:16:11.780 }' 00:16:11.780 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.780 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.780 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:11.780 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:11.780 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:12.037 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:12.295 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:12.295 "name": "BaseBdev2", 00:16:12.295 "aliases": [ 00:16:12.295 "5db7afff-0226-4b65-bc5c-da99a4d56e63" 00:16:12.295 ], 00:16:12.295 "product_name": "Malloc disk", 00:16:12.295 "block_size": 512, 00:16:12.295 "num_blocks": 65536, 00:16:12.295 "uuid": "5db7afff-0226-4b65-bc5c-da99a4d56e63", 00:16:12.295 "assigned_rate_limits": { 00:16:12.295 "rw_ios_per_sec": 0, 00:16:12.295 "rw_mbytes_per_sec": 0, 00:16:12.295 "r_mbytes_per_sec": 0, 00:16:12.295 "w_mbytes_per_sec": 0 00:16:12.295 }, 00:16:12.295 "claimed": true, 00:16:12.295 "claim_type": "exclusive_write", 00:16:12.295 "zoned": false, 00:16:12.295 "supported_io_types": { 00:16:12.295 "read": true, 00:16:12.295 "write": true, 00:16:12.295 "unmap": true, 00:16:12.295 "write_zeroes": true, 00:16:12.295 "flush": true, 00:16:12.295 "reset": true, 00:16:12.295 "compare": false, 00:16:12.295 "compare_and_write": false, 00:16:12.295 "abort": true, 00:16:12.295 "nvme_admin": false, 00:16:12.295 "nvme_io": false 00:16:12.295 }, 00:16:12.295 "memory_domains": [ 00:16:12.295 { 00:16:12.295 "dma_device_id": "system", 00:16:12.295 "dma_device_type": 1 00:16:12.295 }, 00:16:12.295 { 00:16:12.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.295 "dma_device_type": 2 00:16:12.295 } 00:16:12.295 ], 00:16:12.295 "driver_specific": {} 00:16:12.295 }' 00:16:12.295 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:12.295 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:12.553 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:12.553 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:12.553 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:12.553 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.553 23:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.553 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.553 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.553 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.553 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.811 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:12.811 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 
00:16:12.811 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:12.811 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:12.811 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:12.811 "name": "BaseBdev3", 00:16:12.811 "aliases": [ 00:16:12.811 "9cbd60c4-6e20-4f30-a10c-086be640d3df" 00:16:12.811 ], 00:16:12.811 "product_name": "Malloc disk", 00:16:12.811 "block_size": 512, 00:16:12.811 "num_blocks": 65536, 00:16:12.811 "uuid": "9cbd60c4-6e20-4f30-a10c-086be640d3df", 00:16:12.811 "assigned_rate_limits": { 00:16:12.811 "rw_ios_per_sec": 0, 00:16:12.811 "rw_mbytes_per_sec": 0, 00:16:12.811 "r_mbytes_per_sec": 0, 00:16:12.811 "w_mbytes_per_sec": 0 00:16:12.811 }, 00:16:12.811 "claimed": true, 00:16:12.811 "claim_type": "exclusive_write", 00:16:12.811 "zoned": false, 00:16:12.811 "supported_io_types": { 00:16:12.811 "read": true, 00:16:12.811 "write": true, 00:16:12.811 "unmap": true, 00:16:12.811 "write_zeroes": true, 00:16:12.811 "flush": true, 00:16:12.811 "reset": true, 00:16:12.811 "compare": false, 00:16:12.811 "compare_and_write": false, 00:16:12.811 "abort": true, 00:16:12.811 "nvme_admin": false, 00:16:12.811 "nvme_io": false 00:16:12.811 }, 00:16:12.811 "memory_domains": [ 00:16:12.811 { 00:16:12.811 "dma_device_id": "system", 00:16:12.811 "dma_device_type": 1 00:16:12.811 }, 00:16:12.811 { 00:16:12.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.811 "dma_device_type": 2 00:16:12.811 } 00:16:12.811 ], 00:16:12.811 "driver_specific": {} 00:16:12.811 }' 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.069 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:13.327 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:13.584 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:13.584 "name": "BaseBdev4", 00:16:13.584 "aliases": [ 00:16:13.584 
"3c819a48-a5ad-45eb-90e5-a73ab4c7f54e" 00:16:13.584 ], 00:16:13.584 "product_name": "Malloc disk", 00:16:13.584 "block_size": 512, 00:16:13.584 "num_blocks": 65536, 00:16:13.584 "uuid": "3c819a48-a5ad-45eb-90e5-a73ab4c7f54e", 00:16:13.584 "assigned_rate_limits": { 00:16:13.584 "rw_ios_per_sec": 0, 00:16:13.584 "rw_mbytes_per_sec": 0, 00:16:13.584 "r_mbytes_per_sec": 0, 00:16:13.584 "w_mbytes_per_sec": 0 00:16:13.584 }, 00:16:13.584 "claimed": true, 00:16:13.584 "claim_type": "exclusive_write", 00:16:13.584 "zoned": false, 00:16:13.584 "supported_io_types": { 00:16:13.584 "read": true, 00:16:13.584 "write": true, 00:16:13.584 "unmap": true, 00:16:13.584 "write_zeroes": true, 00:16:13.584 "flush": true, 00:16:13.584 "reset": true, 00:16:13.584 "compare": false, 00:16:13.584 "compare_and_write": false, 00:16:13.584 "abort": true, 00:16:13.584 "nvme_admin": false, 00:16:13.584 "nvme_io": false 00:16:13.584 }, 00:16:13.584 "memory_domains": [ 00:16:13.584 { 00:16:13.584 "dma_device_id": "system", 00:16:13.584 "dma_device_type": 1 00:16:13.584 }, 00:16:13.584 { 00:16:13.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.584 "dma_device_type": 2 00:16:13.584 } 00:16:13.584 ], 00:16:13.584 "driver_specific": {} 00:16:13.584 }' 00:16:13.584 23:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.584 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:13.842 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:14.100 [2024-05-14 23:58:14.564919] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:14.100 [2024-05-14 23:58:14.564947] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:14.100 [2024-05-14 23:58:14.565005] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:14.100 [2024-05-14 23:58:14.565069] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:14.100 [2024-05-14 23:58:14.565081] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x281e8b0 name Existed_Raid, state offline 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 434289 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 434289 ']' 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 434289 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 434289 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 434289' 00:16:14.100 killing process with pid 434289 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 434289 00:16:14.100 [2024-05-14 23:58:14.631589] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:14.100 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 434289 00:16:14.100 [2024-05-14 23:58:14.671960] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:14.357 23:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:16:14.357 00:16:14.357 real 0m31.803s 00:16:14.357 user 0m58.333s 00:16:14.357 sys 0m5.706s 00:16:14.357 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:14.357 23:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.357 ************************************ 00:16:14.357 END TEST raid_state_function_test 00:16:14.357 ************************************ 00:16:14.357 23:58:14 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:16:14.358 23:58:14 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:16:14.358 23:58:14 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:14.358 23:58:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:14.624 ************************************ 00:16:14.624 START TEST raid_state_function_test_sb 00:16:14.624 ************************************ 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 true 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:14.624 23:58:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:14.624 23:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=439024 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 439024' 00:16:14.624 Process raid pid: 439024 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 439024 /var/tmp/spdk-raid.sock 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 439024 ']' 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:14.624 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:14.624 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.624 [2024-05-14 23:58:15.062763] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:16:14.624 [2024-05-14 23:58:15.062828] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:14.624 [2024-05-14 23:58:15.190824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:14.884 [2024-05-14 23:58:15.296960] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:14.884 [2024-05-14 23:58:15.354585] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:14.885 [2024-05-14 23:58:15.354621] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:15.450 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:15.450 23:58:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:16:15.450 23:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:15.707 [2024-05-14 23:58:16.206598] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:15.707 [2024-05-14 23:58:16.206642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:15.707 [2024-05-14 23:58:16.206653] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:15.707 [2024-05-14 23:58:16.206666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:15.707 [2024-05-14 23:58:16.206675] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:15.707 [2024-05-14 23:58:16.206687] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:15.707 [2024-05-14 23:58:16.206696] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:15.707 [2024-05-14 23:58:16.206707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:15.707 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:15.707 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:15.707 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:15.708 
23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.708 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.965 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:15.965 "name": "Existed_Raid", 00:16:15.965 "uuid": "06500d05-2ddd-4cd5-96c9-6e7415b6bbab", 00:16:15.965 "strip_size_kb": 64, 00:16:15.965 "state": "configuring", 00:16:15.965 "raid_level": "raid0", 00:16:15.965 "superblock": true, 00:16:15.965 "num_base_bdevs": 4, 00:16:15.965 "num_base_bdevs_discovered": 0, 00:16:15.965 "num_base_bdevs_operational": 4, 00:16:15.965 "base_bdevs_list": [ 00:16:15.965 { 00:16:15.965 "name": "BaseBdev1", 00:16:15.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.965 "is_configured": false, 00:16:15.965 "data_offset": 0, 00:16:15.965 "data_size": 0 00:16:15.965 }, 00:16:15.965 { 00:16:15.965 "name": "BaseBdev2", 00:16:15.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.965 "is_configured": false, 00:16:15.965 "data_offset": 0, 00:16:15.965 "data_size": 0 00:16:15.965 }, 00:16:15.965 { 00:16:15.965 "name": "BaseBdev3", 00:16:15.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.965 "is_configured": false, 00:16:15.965 "data_offset": 0, 00:16:15.965 "data_size": 0 00:16:15.965 }, 00:16:15.965 { 00:16:15.965 "name": "BaseBdev4", 00:16:15.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.965 "is_configured": false, 00:16:15.965 "data_offset": 0, 00:16:15.965 "data_size": 0 00:16:15.965 } 00:16:15.965 ] 00:16:15.965 }' 00:16:15.965 23:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:15.965 23:58:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.530 23:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:16.788 [2024-05-14 23:58:17.269254] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:16.788 [2024-05-14 23:58:17.269290] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad4c00 name Existed_Raid, state configuring 00:16:16.788 23:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:17.046 [2024-05-14 23:58:17.509915] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:17.046 [2024-05-14 23:58:17.509950] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:17.046 [2024-05-14 23:58:17.509964] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:17.046 [2024-05-14 23:58:17.509980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:17.046 [2024-05-14 23:58:17.509993] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev3 00:16:17.046 [2024-05-14 23:58:17.510009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:17.046 [2024-05-14 23:58:17.510023] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:17.046 [2024-05-14 23:58:17.510039] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:17.046 23:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:17.304 [2024-05-14 23:58:17.761701] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:17.304 BaseBdev1 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:17.304 23:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:17.560 23:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:17.817 [ 00:16:17.817 { 00:16:17.817 "name": "BaseBdev1", 00:16:17.817 "aliases": [ 00:16:17.817 "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13" 00:16:17.817 ], 00:16:17.817 "product_name": "Malloc disk", 00:16:17.817 "block_size": 512, 00:16:17.817 "num_blocks": 65536, 00:16:17.817 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:17.817 "assigned_rate_limits": { 00:16:17.817 "rw_ios_per_sec": 0, 00:16:17.817 "rw_mbytes_per_sec": 0, 00:16:17.817 "r_mbytes_per_sec": 0, 00:16:17.817 "w_mbytes_per_sec": 0 00:16:17.817 }, 00:16:17.817 "claimed": true, 00:16:17.817 "claim_type": "exclusive_write", 00:16:17.817 "zoned": false, 00:16:17.817 "supported_io_types": { 00:16:17.817 "read": true, 00:16:17.817 "write": true, 00:16:17.817 "unmap": true, 00:16:17.817 "write_zeroes": true, 00:16:17.817 "flush": true, 00:16:17.817 "reset": true, 00:16:17.817 "compare": false, 00:16:17.817 "compare_and_write": false, 00:16:17.817 "abort": true, 00:16:17.817 "nvme_admin": false, 00:16:17.817 "nvme_io": false 00:16:17.817 }, 00:16:17.817 "memory_domains": [ 00:16:17.817 { 00:16:17.817 "dma_device_id": "system", 00:16:17.817 "dma_device_type": 1 00:16:17.817 }, 00:16:17.817 { 00:16:17.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.817 "dma_device_type": 2 00:16:17.817 } 00:16:17.817 ], 00:16:17.817 "driver_specific": {} 00:16:17.817 } 00:16:17.817 ] 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:17.817 
23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.817 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.075 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:18.075 "name": "Existed_Raid", 00:16:18.075 "uuid": "07eae3d4-1e20-4595-8903-5aca27cad600", 00:16:18.075 "strip_size_kb": 64, 00:16:18.075 "state": "configuring", 00:16:18.075 "raid_level": "raid0", 00:16:18.075 "superblock": true, 00:16:18.075 "num_base_bdevs": 4, 00:16:18.075 "num_base_bdevs_discovered": 1, 00:16:18.075 "num_base_bdevs_operational": 4, 00:16:18.075 "base_bdevs_list": [ 00:16:18.075 { 00:16:18.075 "name": "BaseBdev1", 00:16:18.075 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:18.075 "is_configured": true, 00:16:18.075 "data_offset": 2048, 00:16:18.075 "data_size": 63488 00:16:18.075 }, 00:16:18.075 { 00:16:18.075 "name": "BaseBdev2", 00:16:18.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.075 "is_configured": false, 00:16:18.075 "data_offset": 0, 00:16:18.075 "data_size": 0 00:16:18.075 }, 00:16:18.075 { 00:16:18.075 "name": "BaseBdev3", 00:16:18.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.075 "is_configured": false, 00:16:18.075 "data_offset": 0, 00:16:18.075 "data_size": 0 00:16:18.075 }, 00:16:18.075 { 00:16:18.075 "name": "BaseBdev4", 00:16:18.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.075 "is_configured": false, 00:16:18.075 "data_offset": 0, 00:16:18.075 "data_size": 0 00:16:18.075 } 00:16:18.075 ] 00:16:18.075 }' 00:16:18.075 23:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:18.075 23:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:18.640 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:18.897 [2024-05-14 23:58:19.249649] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:18.897 [2024-05-14 23:58:19.249692] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad4ea0 name Existed_Raid, state configuring 00:16:18.897 23:58:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:18.897 [2024-05-14 23:58:19.482309] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:18.897 [2024-05-14 23:58:19.484026] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:18.897 [2024-05-14 23:58:19.484060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:18.897 [2024-05-14 23:58:19.484076] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:18.897 [2024-05-14 23:58:19.484091] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:18.897 [2024-05-14 23:58:19.484104] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:18.897 [2024-05-14 23:58:19.484119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.154 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.412 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:19.412 "name": "Existed_Raid", 00:16:19.412 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:19.412 "strip_size_kb": 64, 00:16:19.412 "state": "configuring", 00:16:19.412 "raid_level": "raid0", 00:16:19.412 "superblock": true, 00:16:19.412 "num_base_bdevs": 4, 00:16:19.412 "num_base_bdevs_discovered": 1, 00:16:19.412 "num_base_bdevs_operational": 4, 00:16:19.412 "base_bdevs_list": [ 00:16:19.412 { 00:16:19.412 "name": "BaseBdev1", 00:16:19.412 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:19.412 "is_configured": true, 00:16:19.412 "data_offset": 2048, 
00:16:19.412 "data_size": 63488 00:16:19.412 }, 00:16:19.412 { 00:16:19.412 "name": "BaseBdev2", 00:16:19.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.412 "is_configured": false, 00:16:19.412 "data_offset": 0, 00:16:19.412 "data_size": 0 00:16:19.412 }, 00:16:19.412 { 00:16:19.412 "name": "BaseBdev3", 00:16:19.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.412 "is_configured": false, 00:16:19.412 "data_offset": 0, 00:16:19.412 "data_size": 0 00:16:19.412 }, 00:16:19.412 { 00:16:19.412 "name": "BaseBdev4", 00:16:19.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.412 "is_configured": false, 00:16:19.412 "data_offset": 0, 00:16:19.412 "data_size": 0 00:16:19.412 } 00:16:19.412 ] 00:16:19.412 }' 00:16:19.412 23:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:19.412 23:58:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:19.992 23:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:19.992 [2024-05-14 23:58:20.560538] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:19.992 BaseBdev2 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:20.265 23:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:20.522 [ 00:16:20.522 { 00:16:20.522 "name": "BaseBdev2", 00:16:20.522 "aliases": [ 00:16:20.522 "083b6313-03db-4380-be49-4f33a771f1c5" 00:16:20.522 ], 00:16:20.522 "product_name": "Malloc disk", 00:16:20.522 "block_size": 512, 00:16:20.522 "num_blocks": 65536, 00:16:20.522 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:20.522 "assigned_rate_limits": { 00:16:20.522 "rw_ios_per_sec": 0, 00:16:20.522 "rw_mbytes_per_sec": 0, 00:16:20.522 "r_mbytes_per_sec": 0, 00:16:20.522 "w_mbytes_per_sec": 0 00:16:20.522 }, 00:16:20.522 "claimed": true, 00:16:20.522 "claim_type": "exclusive_write", 00:16:20.522 "zoned": false, 00:16:20.522 "supported_io_types": { 00:16:20.522 "read": true, 00:16:20.522 "write": true, 00:16:20.522 "unmap": true, 00:16:20.522 "write_zeroes": true, 00:16:20.522 "flush": true, 00:16:20.522 "reset": true, 00:16:20.522 "compare": false, 00:16:20.522 "compare_and_write": false, 00:16:20.522 "abort": true, 00:16:20.522 "nvme_admin": false, 00:16:20.522 "nvme_io": false 00:16:20.522 }, 00:16:20.522 "memory_domains": [ 00:16:20.522 { 00:16:20.522 "dma_device_id": "system", 
00:16:20.522 "dma_device_type": 1 00:16:20.522 }, 00:16:20.522 { 00:16:20.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.522 "dma_device_type": 2 00:16:20.522 } 00:16:20.522 ], 00:16:20.522 "driver_specific": {} 00:16:20.522 } 00:16:20.522 ] 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.522 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.779 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:20.779 "name": "Existed_Raid", 00:16:20.779 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:20.779 "strip_size_kb": 64, 00:16:20.779 "state": "configuring", 00:16:20.779 "raid_level": "raid0", 00:16:20.779 "superblock": true, 00:16:20.779 "num_base_bdevs": 4, 00:16:20.779 "num_base_bdevs_discovered": 2, 00:16:20.779 "num_base_bdevs_operational": 4, 00:16:20.779 "base_bdevs_list": [ 00:16:20.779 { 00:16:20.779 "name": "BaseBdev1", 00:16:20.779 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:20.779 "is_configured": true, 00:16:20.779 "data_offset": 2048, 00:16:20.779 "data_size": 63488 00:16:20.779 }, 00:16:20.779 { 00:16:20.779 "name": "BaseBdev2", 00:16:20.779 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:20.779 "is_configured": true, 00:16:20.779 "data_offset": 2048, 00:16:20.779 "data_size": 63488 00:16:20.779 }, 00:16:20.779 { 00:16:20.779 "name": "BaseBdev3", 00:16:20.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.779 "is_configured": false, 00:16:20.779 "data_offset": 0, 00:16:20.779 "data_size": 0 00:16:20.779 }, 00:16:20.779 { 00:16:20.779 "name": "BaseBdev4", 00:16:20.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.779 "is_configured": false, 00:16:20.779 "data_offset": 0, 00:16:20.779 "data_size": 0 00:16:20.779 } 00:16:20.779 ] 00:16:20.779 }' 00:16:20.779 
23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:20.779 23:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:21.343 23:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:21.600 [2024-05-14 23:58:22.140123] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:21.600 BaseBdev3 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:21.600 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:21.858 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:22.114 [ 00:16:22.114 { 00:16:22.114 "name": "BaseBdev3", 00:16:22.114 "aliases": [ 00:16:22.114 "baa474fe-1525-44a9-b9af-46beecdf7c97" 00:16:22.114 ], 00:16:22.114 "product_name": "Malloc disk", 00:16:22.114 "block_size": 512, 00:16:22.114 "num_blocks": 65536, 00:16:22.114 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:22.114 "assigned_rate_limits": { 00:16:22.114 "rw_ios_per_sec": 0, 00:16:22.114 "rw_mbytes_per_sec": 0, 00:16:22.114 "r_mbytes_per_sec": 0, 00:16:22.114 "w_mbytes_per_sec": 0 00:16:22.114 }, 00:16:22.115 "claimed": true, 00:16:22.115 "claim_type": "exclusive_write", 00:16:22.115 "zoned": false, 00:16:22.115 "supported_io_types": { 00:16:22.115 "read": true, 00:16:22.115 "write": true, 00:16:22.115 "unmap": true, 00:16:22.115 "write_zeroes": true, 00:16:22.115 "flush": true, 00:16:22.115 "reset": true, 00:16:22.115 "compare": false, 00:16:22.115 "compare_and_write": false, 00:16:22.115 "abort": true, 00:16:22.115 "nvme_admin": false, 00:16:22.115 "nvme_io": false 00:16:22.115 }, 00:16:22.115 "memory_domains": [ 00:16:22.115 { 00:16:22.115 "dma_device_id": "system", 00:16:22.115 "dma_device_type": 1 00:16:22.115 }, 00:16:22.115 { 00:16:22.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.115 "dma_device_type": 2 00:16:22.115 } 00:16:22.115 ], 00:16:22.115 "driver_specific": {} 00:16:22.115 } 00:16:22.115 ] 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:22.115 23:58:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.115 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.372 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:22.372 "name": "Existed_Raid", 00:16:22.372 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:22.372 "strip_size_kb": 64, 00:16:22.372 "state": "configuring", 00:16:22.372 "raid_level": "raid0", 00:16:22.372 "superblock": true, 00:16:22.372 "num_base_bdevs": 4, 00:16:22.372 "num_base_bdevs_discovered": 3, 00:16:22.372 "num_base_bdevs_operational": 4, 00:16:22.372 "base_bdevs_list": [ 00:16:22.372 { 00:16:22.372 "name": "BaseBdev1", 00:16:22.372 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:22.372 "is_configured": true, 00:16:22.372 "data_offset": 2048, 00:16:22.372 "data_size": 63488 00:16:22.372 }, 00:16:22.372 { 00:16:22.372 "name": "BaseBdev2", 00:16:22.372 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:22.372 "is_configured": true, 00:16:22.372 "data_offset": 2048, 00:16:22.372 "data_size": 63488 00:16:22.372 }, 00:16:22.372 { 00:16:22.372 "name": "BaseBdev3", 00:16:22.372 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:22.372 "is_configured": true, 00:16:22.372 "data_offset": 2048, 00:16:22.372 "data_size": 63488 00:16:22.372 }, 00:16:22.372 { 00:16:22.372 "name": "BaseBdev4", 00:16:22.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.372 "is_configured": false, 00:16:22.372 "data_offset": 0, 00:16:22.372 "data_size": 0 00:16:22.372 } 00:16:22.372 ] 00:16:22.372 }' 00:16:22.372 23:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:22.372 23:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:22.941 23:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:23.206 [2024-05-14 23:58:23.579729] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:23.206 [2024-05-14 23:58:23.579899] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad4470 00:16:23.206 [2024-05-14 23:58:23.579912] 
bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:23.206 [2024-05-14 23:58:23.580092] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad4b40 00:16:23.206 [2024-05-14 23:58:23.580219] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad4470 00:16:23.206 [2024-05-14 23:58:23.580229] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ad4470 00:16:23.206 [2024-05-14 23:58:23.580324] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:23.206 BaseBdev4 00:16:23.206 23:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:16:23.206 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:23.206 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:23.206 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:23.206 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:23.207 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:23.207 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:23.463 23:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:23.720 [ 00:16:23.720 { 00:16:23.720 "name": "BaseBdev4", 00:16:23.720 "aliases": [ 00:16:23.720 "d904ca09-6676-4f45-ab9d-6b8a053cb8b1" 00:16:23.720 ], 00:16:23.720 "product_name": "Malloc disk", 00:16:23.720 "block_size": 512, 00:16:23.720 "num_blocks": 65536, 00:16:23.720 "uuid": "d904ca09-6676-4f45-ab9d-6b8a053cb8b1", 00:16:23.720 "assigned_rate_limits": { 00:16:23.720 "rw_ios_per_sec": 0, 00:16:23.720 "rw_mbytes_per_sec": 0, 00:16:23.720 "r_mbytes_per_sec": 0, 00:16:23.720 "w_mbytes_per_sec": 0 00:16:23.720 }, 00:16:23.720 "claimed": true, 00:16:23.720 "claim_type": "exclusive_write", 00:16:23.720 "zoned": false, 00:16:23.720 "supported_io_types": { 00:16:23.720 "read": true, 00:16:23.720 "write": true, 00:16:23.720 "unmap": true, 00:16:23.720 "write_zeroes": true, 00:16:23.720 "flush": true, 00:16:23.720 "reset": true, 00:16:23.720 "compare": false, 00:16:23.720 "compare_and_write": false, 00:16:23.720 "abort": true, 00:16:23.720 "nvme_admin": false, 00:16:23.720 "nvme_io": false 00:16:23.720 }, 00:16:23.720 "memory_domains": [ 00:16:23.720 { 00:16:23.720 "dma_device_id": "system", 00:16:23.720 "dma_device_type": 1 00:16:23.720 }, 00:16:23.720 { 00:16:23.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.720 "dma_device_type": 2 00:16:23.720 } 00:16:23.720 ], 00:16:23.720 "driver_specific": {} 00:16:23.720 } 00:16:23.720 ] 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online 
raid0 64 4 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.720 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.977 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:23.977 "name": "Existed_Raid", 00:16:23.977 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:23.977 "strip_size_kb": 64, 00:16:23.977 "state": "online", 00:16:23.977 "raid_level": "raid0", 00:16:23.977 "superblock": true, 00:16:23.977 "num_base_bdevs": 4, 00:16:23.977 "num_base_bdevs_discovered": 4, 00:16:23.977 "num_base_bdevs_operational": 4, 00:16:23.977 "base_bdevs_list": [ 00:16:23.977 { 00:16:23.977 "name": "BaseBdev1", 00:16:23.977 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:23.977 "is_configured": true, 00:16:23.977 "data_offset": 2048, 00:16:23.977 "data_size": 63488 00:16:23.977 }, 00:16:23.977 { 00:16:23.977 "name": "BaseBdev2", 00:16:23.977 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:23.977 "is_configured": true, 00:16:23.977 "data_offset": 2048, 00:16:23.977 "data_size": 63488 00:16:23.977 }, 00:16:23.977 { 00:16:23.977 "name": "BaseBdev3", 00:16:23.977 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:23.977 "is_configured": true, 00:16:23.977 "data_offset": 2048, 00:16:23.977 "data_size": 63488 00:16:23.977 }, 00:16:23.977 { 00:16:23.977 "name": "BaseBdev4", 00:16:23.977 "uuid": "d904ca09-6676-4f45-ab9d-6b8a053cb8b1", 00:16:23.977 "is_configured": true, 00:16:23.977 "data_offset": 2048, 00:16:23.977 "data_size": 63488 00:16:23.977 } 00:16:23.977 ] 00:16:23.977 }' 00:16:23.977 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:23.977 23:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:24.541 23:58:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:24.541 23:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:24.798 [2024-05-14 23:58:25.148188] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.798 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:24.798 "name": "Existed_Raid", 00:16:24.798 "aliases": [ 00:16:24.798 "830064ff-0ce9-410b-9a33-715088a43c3c" 00:16:24.798 ], 00:16:24.798 "product_name": "Raid Volume", 00:16:24.798 "block_size": 512, 00:16:24.798 "num_blocks": 253952, 00:16:24.798 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:24.798 "assigned_rate_limits": { 00:16:24.798 "rw_ios_per_sec": 0, 00:16:24.798 "rw_mbytes_per_sec": 0, 00:16:24.798 "r_mbytes_per_sec": 0, 00:16:24.798 "w_mbytes_per_sec": 0 00:16:24.798 }, 00:16:24.798 "claimed": false, 00:16:24.798 "zoned": false, 00:16:24.798 "supported_io_types": { 00:16:24.798 "read": true, 00:16:24.798 "write": true, 00:16:24.798 "unmap": true, 00:16:24.798 "write_zeroes": true, 00:16:24.798 "flush": true, 00:16:24.798 "reset": true, 00:16:24.798 "compare": false, 00:16:24.798 "compare_and_write": false, 00:16:24.798 "abort": false, 00:16:24.798 "nvme_admin": false, 00:16:24.798 "nvme_io": false 00:16:24.798 }, 00:16:24.798 "memory_domains": [ 00:16:24.798 { 00:16:24.798 "dma_device_id": "system", 00:16:24.798 "dma_device_type": 1 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.798 "dma_device_type": 2 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "system", 00:16:24.798 "dma_device_type": 1 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.798 "dma_device_type": 2 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "system", 00:16:24.798 "dma_device_type": 1 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.798 "dma_device_type": 2 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "system", 00:16:24.798 "dma_device_type": 1 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.798 "dma_device_type": 2 00:16:24.798 } 00:16:24.798 ], 00:16:24.798 "driver_specific": { 00:16:24.798 "raid": { 00:16:24.798 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:24.798 "strip_size_kb": 64, 00:16:24.798 "state": "online", 00:16:24.798 "raid_level": "raid0", 00:16:24.798 "superblock": true, 00:16:24.798 "num_base_bdevs": 4, 00:16:24.798 "num_base_bdevs_discovered": 4, 00:16:24.798 "num_base_bdevs_operational": 4, 00:16:24.798 "base_bdevs_list": [ 00:16:24.798 { 00:16:24.798 "name": "BaseBdev1", 00:16:24.798 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:24.798 "is_configured": true, 00:16:24.798 "data_offset": 2048, 00:16:24.798 "data_size": 63488 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "name": "BaseBdev2", 00:16:24.798 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:24.798 "is_configured": true, 00:16:24.798 "data_offset": 2048, 00:16:24.798 "data_size": 63488 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "name": "BaseBdev3", 
00:16:24.798 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:24.798 "is_configured": true, 00:16:24.798 "data_offset": 2048, 00:16:24.798 "data_size": 63488 00:16:24.798 }, 00:16:24.798 { 00:16:24.798 "name": "BaseBdev4", 00:16:24.798 "uuid": "d904ca09-6676-4f45-ab9d-6b8a053cb8b1", 00:16:24.798 "is_configured": true, 00:16:24.798 "data_offset": 2048, 00:16:24.798 "data_size": 63488 00:16:24.798 } 00:16:24.798 ] 00:16:24.798 } 00:16:24.798 } 00:16:24.799 }' 00:16:24.799 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:24.799 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:16:24.799 BaseBdev2 00:16:24.799 BaseBdev3 00:16:24.799 BaseBdev4' 00:16:24.799 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:24.799 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:24.799 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:25.055 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:25.055 "name": "BaseBdev1", 00:16:25.055 "aliases": [ 00:16:25.055 "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13" 00:16:25.055 ], 00:16:25.055 "product_name": "Malloc disk", 00:16:25.056 "block_size": 512, 00:16:25.056 "num_blocks": 65536, 00:16:25.056 "uuid": "5ac787f2-64b6-4f17-ba17-d7d3f9f96a13", 00:16:25.056 "assigned_rate_limits": { 00:16:25.056 "rw_ios_per_sec": 0, 00:16:25.056 "rw_mbytes_per_sec": 0, 00:16:25.056 "r_mbytes_per_sec": 0, 00:16:25.056 "w_mbytes_per_sec": 0 00:16:25.056 }, 00:16:25.056 "claimed": true, 00:16:25.056 "claim_type": "exclusive_write", 00:16:25.056 "zoned": false, 00:16:25.056 "supported_io_types": { 00:16:25.056 "read": true, 00:16:25.056 "write": true, 00:16:25.056 "unmap": true, 00:16:25.056 "write_zeroes": true, 00:16:25.056 "flush": true, 00:16:25.056 "reset": true, 00:16:25.056 "compare": false, 00:16:25.056 "compare_and_write": false, 00:16:25.056 "abort": true, 00:16:25.056 "nvme_admin": false, 00:16:25.056 "nvme_io": false 00:16:25.056 }, 00:16:25.056 "memory_domains": [ 00:16:25.056 { 00:16:25.056 "dma_device_id": "system", 00:16:25.056 "dma_device_type": 1 00:16:25.056 }, 00:16:25.056 { 00:16:25.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.056 "dma_device_type": 2 00:16:25.056 } 00:16:25.056 ], 00:16:25.056 "driver_specific": {} 00:16:25.056 }' 00:16:25.056 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:25.056 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:25.056 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:25.056 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:25.056 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:25.312 23:58:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:25.312 23:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:25.569 "name": "BaseBdev2", 00:16:25.569 "aliases": [ 00:16:25.569 "083b6313-03db-4380-be49-4f33a771f1c5" 00:16:25.569 ], 00:16:25.569 "product_name": "Malloc disk", 00:16:25.569 "block_size": 512, 00:16:25.569 "num_blocks": 65536, 00:16:25.569 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:25.569 "assigned_rate_limits": { 00:16:25.569 "rw_ios_per_sec": 0, 00:16:25.569 "rw_mbytes_per_sec": 0, 00:16:25.569 "r_mbytes_per_sec": 0, 00:16:25.569 "w_mbytes_per_sec": 0 00:16:25.569 }, 00:16:25.569 "claimed": true, 00:16:25.569 "claim_type": "exclusive_write", 00:16:25.569 "zoned": false, 00:16:25.569 "supported_io_types": { 00:16:25.569 "read": true, 00:16:25.569 "write": true, 00:16:25.569 "unmap": true, 00:16:25.569 "write_zeroes": true, 00:16:25.569 "flush": true, 00:16:25.569 "reset": true, 00:16:25.569 "compare": false, 00:16:25.569 "compare_and_write": false, 00:16:25.569 "abort": true, 00:16:25.569 "nvme_admin": false, 00:16:25.569 "nvme_io": false 00:16:25.569 }, 00:16:25.569 "memory_domains": [ 00:16:25.569 { 00:16:25.569 "dma_device_id": "system", 00:16:25.569 "dma_device_type": 1 00:16:25.569 }, 00:16:25.569 { 00:16:25.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.569 "dma_device_type": 2 00:16:25.569 } 00:16:25.569 ], 00:16:25.569 "driver_specific": {} 00:16:25.569 }' 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:25.569 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:25.826 23:58:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:25.826 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:26.083 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:26.083 "name": "BaseBdev3", 00:16:26.083 "aliases": [ 00:16:26.083 "baa474fe-1525-44a9-b9af-46beecdf7c97" 00:16:26.083 ], 00:16:26.083 "product_name": "Malloc disk", 00:16:26.083 "block_size": 512, 00:16:26.083 "num_blocks": 65536, 00:16:26.083 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:26.083 "assigned_rate_limits": { 00:16:26.083 "rw_ios_per_sec": 0, 00:16:26.083 "rw_mbytes_per_sec": 0, 00:16:26.083 "r_mbytes_per_sec": 0, 00:16:26.083 "w_mbytes_per_sec": 0 00:16:26.083 }, 00:16:26.083 "claimed": true, 00:16:26.083 "claim_type": "exclusive_write", 00:16:26.083 "zoned": false, 00:16:26.083 "supported_io_types": { 00:16:26.083 "read": true, 00:16:26.083 "write": true, 00:16:26.083 "unmap": true, 00:16:26.083 "write_zeroes": true, 00:16:26.083 "flush": true, 00:16:26.083 "reset": true, 00:16:26.083 "compare": false, 00:16:26.083 "compare_and_write": false, 00:16:26.083 "abort": true, 00:16:26.083 "nvme_admin": false, 00:16:26.083 "nvme_io": false 00:16:26.083 }, 00:16:26.083 "memory_domains": [ 00:16:26.083 { 00:16:26.083 "dma_device_id": "system", 00:16:26.083 "dma_device_type": 1 00:16:26.083 }, 00:16:26.083 { 00:16:26.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.083 "dma_device_type": 2 00:16:26.083 } 00:16:26.083 ], 00:16:26.083 "driver_specific": {} 00:16:26.083 }' 00:16:26.083 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.083 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:26.339 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:26.595 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:26.595 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:26.595 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:26.595 23:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:26.852 23:58:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:26.852 "name": "BaseBdev4", 00:16:26.852 "aliases": [ 00:16:26.852 "d904ca09-6676-4f45-ab9d-6b8a053cb8b1" 00:16:26.852 ], 00:16:26.852 "product_name": "Malloc disk", 00:16:26.852 "block_size": 512, 00:16:26.852 "num_blocks": 65536, 00:16:26.852 "uuid": "d904ca09-6676-4f45-ab9d-6b8a053cb8b1", 00:16:26.852 "assigned_rate_limits": { 00:16:26.852 "rw_ios_per_sec": 0, 00:16:26.852 "rw_mbytes_per_sec": 0, 00:16:26.852 "r_mbytes_per_sec": 0, 00:16:26.852 "w_mbytes_per_sec": 0 00:16:26.852 }, 00:16:26.852 "claimed": true, 00:16:26.852 "claim_type": "exclusive_write", 00:16:26.852 "zoned": false, 00:16:26.852 "supported_io_types": { 00:16:26.852 "read": true, 00:16:26.852 "write": true, 00:16:26.852 "unmap": true, 00:16:26.852 "write_zeroes": true, 00:16:26.852 "flush": true, 00:16:26.852 "reset": true, 00:16:26.852 "compare": false, 00:16:26.852 "compare_and_write": false, 00:16:26.852 "abort": true, 00:16:26.852 "nvme_admin": false, 00:16:26.852 "nvme_io": false 00:16:26.852 }, 00:16:26.852 "memory_domains": [ 00:16:26.853 { 00:16:26.853 "dma_device_id": "system", 00:16:26.853 "dma_device_type": 1 00:16:26.853 }, 00:16:26.853 { 00:16:26.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.853 "dma_device_type": 2 00:16:26.853 } 00:16:26.853 ], 00:16:26.853 "driver_specific": {} 00:16:26.853 }' 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:26.853 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:27.109 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.110 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:27.110 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:27.110 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:27.110 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:27.366 [2024-05-14 23:58:27.766881] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:27.366 [2024-05-14 23:58:27.766906] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:27.366 [2024-05-14 23:58:27.766953] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 
in 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.366 23:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.623 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:27.623 "name": "Existed_Raid", 00:16:27.623 "uuid": "830064ff-0ce9-410b-9a33-715088a43c3c", 00:16:27.623 "strip_size_kb": 64, 00:16:27.623 "state": "offline", 00:16:27.623 "raid_level": "raid0", 00:16:27.623 "superblock": true, 00:16:27.623 "num_base_bdevs": 4, 00:16:27.623 "num_base_bdevs_discovered": 3, 00:16:27.623 "num_base_bdevs_operational": 3, 00:16:27.623 "base_bdevs_list": [ 00:16:27.623 { 00:16:27.623 "name": null, 00:16:27.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.623 "is_configured": false, 00:16:27.623 "data_offset": 2048, 00:16:27.623 "data_size": 63488 00:16:27.623 }, 00:16:27.623 { 00:16:27.623 "name": "BaseBdev2", 00:16:27.623 "uuid": "083b6313-03db-4380-be49-4f33a771f1c5", 00:16:27.623 "is_configured": true, 00:16:27.623 "data_offset": 2048, 00:16:27.623 "data_size": 63488 00:16:27.623 }, 00:16:27.623 { 00:16:27.623 "name": "BaseBdev3", 00:16:27.623 "uuid": "baa474fe-1525-44a9-b9af-46beecdf7c97", 00:16:27.623 "is_configured": true, 00:16:27.623 "data_offset": 2048, 00:16:27.623 "data_size": 63488 00:16:27.623 }, 00:16:27.623 { 00:16:27.623 "name": "BaseBdev4", 00:16:27.623 "uuid": "d904ca09-6676-4f45-ab9d-6b8a053cb8b1", 00:16:27.623 "is_configured": true, 00:16:27.623 "data_offset": 2048, 00:16:27.623 "data_size": 63488 00:16:27.623 } 00:16:27.623 ] 00:16:27.623 }' 00:16:27.623 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:27.623 23:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.187 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:16:28.187 23:58:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:28.187 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.187 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:28.444 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:28.444 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:28.444 23:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:28.444 [2024-05-14 23:58:29.031253] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:28.700 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:28.957 [2024-05-14 23:58:29.402657] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:28.957 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:28.957 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:28.958 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.958 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:29.214 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:29.214 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:29.214 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:29.472 [2024-05-14 23:58:29.910485] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:29.473 [2024-05-14 23:58:29.910529] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad4470 name Existed_Raid, state offline 00:16:29.473 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:29.473 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:29.473 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.473 23:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:29.736 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:29.993 BaseBdev2 00:16:29.993 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:29.994 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.252 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:30.510 [ 00:16:30.510 { 00:16:30.510 "name": "BaseBdev2", 00:16:30.510 "aliases": [ 00:16:30.510 "386e68b8-5446-4b8c-b4db-f305201a68a2" 00:16:30.510 ], 00:16:30.510 "product_name": "Malloc disk", 00:16:30.510 "block_size": 512, 00:16:30.510 "num_blocks": 65536, 00:16:30.510 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:30.510 "assigned_rate_limits": { 00:16:30.510 "rw_ios_per_sec": 0, 00:16:30.510 "rw_mbytes_per_sec": 0, 00:16:30.510 "r_mbytes_per_sec": 0, 00:16:30.510 "w_mbytes_per_sec": 0 00:16:30.510 }, 00:16:30.510 "claimed": false, 00:16:30.510 "zoned": false, 00:16:30.510 "supported_io_types": { 00:16:30.510 "read": true, 00:16:30.510 "write": true, 00:16:30.510 "unmap": true, 00:16:30.510 "write_zeroes": true, 00:16:30.510 "flush": true, 00:16:30.510 "reset": true, 00:16:30.510 "compare": false, 00:16:30.510 "compare_and_write": false, 00:16:30.510 "abort": true, 00:16:30.510 "nvme_admin": false, 00:16:30.510 "nvme_io": false 00:16:30.510 }, 00:16:30.510 "memory_domains": [ 00:16:30.510 { 00:16:30.510 "dma_device_id": "system", 00:16:30.510 "dma_device_type": 1 00:16:30.510 }, 00:16:30.510 { 00:16:30.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.510 "dma_device_type": 2 00:16:30.510 } 00:16:30.510 ], 00:16:30.510 "driver_specific": {} 00:16:30.510 } 00:16:30.510 ] 00:16:30.510 23:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 
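The trace above has just finished one pass of the @302-@304 loop in bdev_raid.sh: BaseBdev2 is recreated with bdev_malloc_create and then waited on until bdev_get_bdevs can report it, and the entries that follow repeat the same steps for BaseBdev3 and BaseBdev4. A minimal standalone sketch of that create-and-wait pattern is included here for reference; it assumes the same rpc.py path and /var/tmp/spdk-raid.sock RPC socket printed in this log, and the waitforbdev function below is a simplified stand-in for the helper in common/autotest_common.sh, not the harness itself.

    #!/usr/bin/env bash
    # Sketch only: recreate the malloc base bdevs and wait for each to appear.
    # Assumed values (taken from the trace above): SPDK checkout path, RPC socket.
    set -e
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    waitforbdev() {
        local bdev_name=$1 bdev_timeout=${2:-2000}
        $rpc bdev_wait_for_examine
        # bdev_get_bdevs -t waits up to <timeout> ms for the named bdev to register.
        $rpc bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout" > /dev/null
    }

    # 32 MiB malloc disks with 512-byte blocks, matching the 65536-block bdevs above.
    for name in BaseBdev2 BaseBdev3 BaseBdev4; do
        $rpc bdev_malloc_create 32 512 -b "$name"
        waitforbdev "$name"
    done

Once the three unclaimed base bdevs exist again, the log continues with bdev_raid_create over 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' and verifies that Existed_Raid sits in the configuring state while BaseBdev1 is still missing.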
00:16:30.510 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:30.510 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:30.510 23:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:30.769 BaseBdev3 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:30.769 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:31.027 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:31.286 [ 00:16:31.286 { 00:16:31.286 "name": "BaseBdev3", 00:16:31.286 "aliases": [ 00:16:31.286 "a440e224-c32a-45ec-abce-6da813f72f62" 00:16:31.286 ], 00:16:31.286 "product_name": "Malloc disk", 00:16:31.286 "block_size": 512, 00:16:31.286 "num_blocks": 65536, 00:16:31.286 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:31.286 "assigned_rate_limits": { 00:16:31.286 "rw_ios_per_sec": 0, 00:16:31.286 "rw_mbytes_per_sec": 0, 00:16:31.286 "r_mbytes_per_sec": 0, 00:16:31.286 "w_mbytes_per_sec": 0 00:16:31.286 }, 00:16:31.286 "claimed": false, 00:16:31.286 "zoned": false, 00:16:31.286 "supported_io_types": { 00:16:31.286 "read": true, 00:16:31.286 "write": true, 00:16:31.286 "unmap": true, 00:16:31.286 "write_zeroes": true, 00:16:31.286 "flush": true, 00:16:31.286 "reset": true, 00:16:31.286 "compare": false, 00:16:31.286 "compare_and_write": false, 00:16:31.286 "abort": true, 00:16:31.286 "nvme_admin": false, 00:16:31.286 "nvme_io": false 00:16:31.286 }, 00:16:31.286 "memory_domains": [ 00:16:31.286 { 00:16:31.286 "dma_device_id": "system", 00:16:31.286 "dma_device_type": 1 00:16:31.286 }, 00:16:31.286 { 00:16:31.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.286 "dma_device_type": 2 00:16:31.286 } 00:16:31.286 ], 00:16:31.286 "driver_specific": {} 00:16:31.286 } 00:16:31.286 ] 00:16:31.286 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:31.286 23:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:31.286 23:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:31.286 23:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:31.286 BaseBdev4 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # 
waitforbdev BaseBdev4 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:31.544 23:58:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:31.544 23:58:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:31.802 [ 00:16:31.802 { 00:16:31.802 "name": "BaseBdev4", 00:16:31.802 "aliases": [ 00:16:31.802 "5de28cc0-b0fe-483b-9622-8fc4523fb17e" 00:16:31.802 ], 00:16:31.802 "product_name": "Malloc disk", 00:16:31.802 "block_size": 512, 00:16:31.802 "num_blocks": 65536, 00:16:31.802 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:31.802 "assigned_rate_limits": { 00:16:31.802 "rw_ios_per_sec": 0, 00:16:31.802 "rw_mbytes_per_sec": 0, 00:16:31.802 "r_mbytes_per_sec": 0, 00:16:31.802 "w_mbytes_per_sec": 0 00:16:31.802 }, 00:16:31.803 "claimed": false, 00:16:31.803 "zoned": false, 00:16:31.803 "supported_io_types": { 00:16:31.803 "read": true, 00:16:31.803 "write": true, 00:16:31.803 "unmap": true, 00:16:31.803 "write_zeroes": true, 00:16:31.803 "flush": true, 00:16:31.803 "reset": true, 00:16:31.803 "compare": false, 00:16:31.803 "compare_and_write": false, 00:16:31.803 "abort": true, 00:16:31.803 "nvme_admin": false, 00:16:31.803 "nvme_io": false 00:16:31.803 }, 00:16:31.803 "memory_domains": [ 00:16:31.803 { 00:16:31.803 "dma_device_id": "system", 00:16:31.803 "dma_device_type": 1 00:16:31.803 }, 00:16:31.803 { 00:16:31.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.803 "dma_device_type": 2 00:16:31.803 } 00:16:31.803 ], 00:16:31.803 "driver_specific": {} 00:16:31.803 } 00:16:31.803 ] 00:16:31.803 23:58:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:31.803 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:31.803 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:31.803 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:32.061 [2024-05-14 23:58:32.592609] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:32.061 [2024-05-14 23:58:32.592649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:32.061 [2024-05-14 23:58:32.592676] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.061 [2024-05-14 23:58:32.594087] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:32.061 [2024-05-14 23:58:32.594139] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.061 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.320 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:32.320 "name": "Existed_Raid", 00:16:32.320 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:32.320 "strip_size_kb": 64, 00:16:32.320 "state": "configuring", 00:16:32.320 "raid_level": "raid0", 00:16:32.320 "superblock": true, 00:16:32.320 "num_base_bdevs": 4, 00:16:32.320 "num_base_bdevs_discovered": 3, 00:16:32.320 "num_base_bdevs_operational": 4, 00:16:32.320 "base_bdevs_list": [ 00:16:32.320 { 00:16:32.320 "name": "BaseBdev1", 00:16:32.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.320 "is_configured": false, 00:16:32.320 "data_offset": 0, 00:16:32.320 "data_size": 0 00:16:32.320 }, 00:16:32.320 { 00:16:32.320 "name": "BaseBdev2", 00:16:32.320 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:32.320 "is_configured": true, 00:16:32.320 "data_offset": 2048, 00:16:32.320 "data_size": 63488 00:16:32.320 }, 00:16:32.320 { 00:16:32.320 "name": "BaseBdev3", 00:16:32.320 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:32.320 "is_configured": true, 00:16:32.320 "data_offset": 2048, 00:16:32.320 "data_size": 63488 00:16:32.320 }, 00:16:32.320 { 00:16:32.320 "name": "BaseBdev4", 00:16:32.320 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:32.320 "is_configured": true, 00:16:32.320 "data_offset": 2048, 00:16:32.320 "data_size": 63488 00:16:32.320 } 00:16:32.320 ] 00:16:32.320 }' 00:16:32.320 23:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:32.320 23:58:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.886 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:33.143 [2024-05-14 23:58:33.643357] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:33.143 23:58:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.143 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.401 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:33.401 "name": "Existed_Raid", 00:16:33.401 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:33.401 "strip_size_kb": 64, 00:16:33.401 "state": "configuring", 00:16:33.401 "raid_level": "raid0", 00:16:33.401 "superblock": true, 00:16:33.401 "num_base_bdevs": 4, 00:16:33.401 "num_base_bdevs_discovered": 2, 00:16:33.401 "num_base_bdevs_operational": 4, 00:16:33.401 "base_bdevs_list": [ 00:16:33.401 { 00:16:33.401 "name": "BaseBdev1", 00:16:33.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.401 "is_configured": false, 00:16:33.401 "data_offset": 0, 00:16:33.401 "data_size": 0 00:16:33.401 }, 00:16:33.401 { 00:16:33.401 "name": null, 00:16:33.401 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:33.401 "is_configured": false, 00:16:33.401 "data_offset": 2048, 00:16:33.401 "data_size": 63488 00:16:33.401 }, 00:16:33.401 { 00:16:33.401 "name": "BaseBdev3", 00:16:33.401 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:33.401 "is_configured": true, 00:16:33.401 "data_offset": 2048, 00:16:33.401 "data_size": 63488 00:16:33.401 }, 00:16:33.401 { 00:16:33.401 "name": "BaseBdev4", 00:16:33.401 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:33.401 "is_configured": true, 00:16:33.401 "data_offset": 2048, 00:16:33.401 "data_size": 63488 00:16:33.401 } 00:16:33.401 ] 00:16:33.401 }' 00:16:33.401 23:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:33.401 23:58:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.967 23:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.967 23:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:34.230 23:58:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:34.230 23:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:34.496 [2024-05-14 23:58:34.986957] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:34.496 BaseBdev1 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:34.496 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.753 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:35.011 [ 00:16:35.011 { 00:16:35.011 "name": "BaseBdev1", 00:16:35.011 "aliases": [ 00:16:35.011 "1e938e3e-edae-4665-9824-0e98b079b371" 00:16:35.011 ], 00:16:35.011 "product_name": "Malloc disk", 00:16:35.011 "block_size": 512, 00:16:35.011 "num_blocks": 65536, 00:16:35.011 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:35.011 "assigned_rate_limits": { 00:16:35.011 "rw_ios_per_sec": 0, 00:16:35.011 "rw_mbytes_per_sec": 0, 00:16:35.011 "r_mbytes_per_sec": 0, 00:16:35.011 "w_mbytes_per_sec": 0 00:16:35.011 }, 00:16:35.011 "claimed": true, 00:16:35.011 "claim_type": "exclusive_write", 00:16:35.011 "zoned": false, 00:16:35.011 "supported_io_types": { 00:16:35.011 "read": true, 00:16:35.011 "write": true, 00:16:35.011 "unmap": true, 00:16:35.011 "write_zeroes": true, 00:16:35.011 "flush": true, 00:16:35.011 "reset": true, 00:16:35.011 "compare": false, 00:16:35.011 "compare_and_write": false, 00:16:35.011 "abort": true, 00:16:35.011 "nvme_admin": false, 00:16:35.011 "nvme_io": false 00:16:35.011 }, 00:16:35.011 "memory_domains": [ 00:16:35.011 { 00:16:35.011 "dma_device_id": "system", 00:16:35.011 "dma_device_type": 1 00:16:35.011 }, 00:16:35.011 { 00:16:35.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.011 "dma_device_type": 2 00:16:35.011 } 00:16:35.011 ], 00:16:35.011 "driver_specific": {} 00:16:35.011 } 00:16:35.011 ] 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:35.011 23:58:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:35.011 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.012 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.269 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:35.269 "name": "Existed_Raid", 00:16:35.269 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:35.269 "strip_size_kb": 64, 00:16:35.269 "state": "configuring", 00:16:35.269 "raid_level": "raid0", 00:16:35.269 "superblock": true, 00:16:35.269 "num_base_bdevs": 4, 00:16:35.269 "num_base_bdevs_discovered": 3, 00:16:35.270 "num_base_bdevs_operational": 4, 00:16:35.270 "base_bdevs_list": [ 00:16:35.270 { 00:16:35.270 "name": "BaseBdev1", 00:16:35.270 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:35.270 "is_configured": true, 00:16:35.270 "data_offset": 2048, 00:16:35.270 "data_size": 63488 00:16:35.270 }, 00:16:35.270 { 00:16:35.270 "name": null, 00:16:35.270 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:35.270 "is_configured": false, 00:16:35.270 "data_offset": 2048, 00:16:35.270 "data_size": 63488 00:16:35.270 }, 00:16:35.270 { 00:16:35.270 "name": "BaseBdev3", 00:16:35.270 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:35.270 "is_configured": true, 00:16:35.270 "data_offset": 2048, 00:16:35.270 "data_size": 63488 00:16:35.270 }, 00:16:35.270 { 00:16:35.270 "name": "BaseBdev4", 00:16:35.270 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:35.270 "is_configured": true, 00:16:35.270 "data_offset": 2048, 00:16:35.270 "data_size": 63488 00:16:35.270 } 00:16:35.270 ] 00:16:35.270 }' 00:16:35.270 23:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:35.270 23:58:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.836 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.836 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:36.094 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:36.094 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:36.353 [2024-05-14 23:58:36.783746] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 4 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.353 23:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.612 23:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:36.612 "name": "Existed_Raid", 00:16:36.612 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:36.612 "strip_size_kb": 64, 00:16:36.612 "state": "configuring", 00:16:36.612 "raid_level": "raid0", 00:16:36.612 "superblock": true, 00:16:36.612 "num_base_bdevs": 4, 00:16:36.612 "num_base_bdevs_discovered": 2, 00:16:36.612 "num_base_bdevs_operational": 4, 00:16:36.612 "base_bdevs_list": [ 00:16:36.612 { 00:16:36.612 "name": "BaseBdev1", 00:16:36.612 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:36.612 "is_configured": true, 00:16:36.612 "data_offset": 2048, 00:16:36.612 "data_size": 63488 00:16:36.612 }, 00:16:36.612 { 00:16:36.612 "name": null, 00:16:36.612 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:36.612 "is_configured": false, 00:16:36.612 "data_offset": 2048, 00:16:36.612 "data_size": 63488 00:16:36.612 }, 00:16:36.612 { 00:16:36.612 "name": null, 00:16:36.612 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:36.612 "is_configured": false, 00:16:36.612 "data_offset": 2048, 00:16:36.612 "data_size": 63488 00:16:36.612 }, 00:16:36.612 { 00:16:36.612 "name": "BaseBdev4", 00:16:36.612 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:36.612 "is_configured": true, 00:16:36.612 "data_offset": 2048, 00:16:36.612 "data_size": 63488 00:16:36.612 } 00:16:36.612 ] 00:16:36.612 }' 00:16:36.612 23:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:36.612 23:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.178 23:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:37.178 23:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.436 23:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:37.436 23:58:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:37.436 [2024-05-14 23:58:38.019032] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.709 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:37.709 "name": "Existed_Raid", 00:16:37.709 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:37.710 "strip_size_kb": 64, 00:16:37.710 "state": "configuring", 00:16:37.710 "raid_level": "raid0", 00:16:37.710 "superblock": true, 00:16:37.710 "num_base_bdevs": 4, 00:16:37.710 "num_base_bdevs_discovered": 3, 00:16:37.710 "num_base_bdevs_operational": 4, 00:16:37.710 "base_bdevs_list": [ 00:16:37.710 { 00:16:37.710 "name": "BaseBdev1", 00:16:37.710 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:37.710 "is_configured": true, 00:16:37.710 "data_offset": 2048, 00:16:37.710 "data_size": 63488 00:16:37.710 }, 00:16:37.710 { 00:16:37.710 "name": null, 00:16:37.710 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:37.710 "is_configured": false, 00:16:37.710 "data_offset": 2048, 00:16:37.710 "data_size": 63488 00:16:37.710 }, 00:16:37.710 { 00:16:37.710 "name": "BaseBdev3", 00:16:37.710 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:37.710 "is_configured": true, 00:16:37.710 "data_offset": 2048, 00:16:37.710 "data_size": 63488 00:16:37.710 }, 00:16:37.710 { 00:16:37.710 "name": "BaseBdev4", 00:16:37.710 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:37.710 "is_configured": true, 00:16:37.710 "data_offset": 2048, 00:16:37.710 "data_size": 63488 00:16:37.710 } 00:16:37.710 ] 00:16:37.710 }' 00:16:37.710 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:37.710 23:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.641 23:58:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.641 23:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:38.641 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:38.641 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:38.898 [2024-05-14 23:58:39.282397] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.898 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.156 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:39.156 "name": "Existed_Raid", 00:16:39.156 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:39.156 "strip_size_kb": 64, 00:16:39.156 "state": "configuring", 00:16:39.156 "raid_level": "raid0", 00:16:39.156 "superblock": true, 00:16:39.156 "num_base_bdevs": 4, 00:16:39.156 "num_base_bdevs_discovered": 2, 00:16:39.156 "num_base_bdevs_operational": 4, 00:16:39.156 "base_bdevs_list": [ 00:16:39.156 { 00:16:39.156 "name": null, 00:16:39.156 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:39.156 "is_configured": false, 00:16:39.156 "data_offset": 2048, 00:16:39.156 "data_size": 63488 00:16:39.156 }, 00:16:39.156 { 00:16:39.156 "name": null, 00:16:39.156 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:39.156 "is_configured": false, 00:16:39.156 "data_offset": 2048, 00:16:39.156 "data_size": 63488 00:16:39.156 }, 00:16:39.156 { 00:16:39.156 "name": "BaseBdev3", 00:16:39.156 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:39.156 "is_configured": true, 00:16:39.156 "data_offset": 2048, 00:16:39.156 "data_size": 63488 00:16:39.156 }, 00:16:39.156 { 00:16:39.156 "name": "BaseBdev4", 00:16:39.156 "uuid": 
"5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:39.156 "is_configured": true, 00:16:39.156 "data_offset": 2048, 00:16:39.156 "data_size": 63488 00:16:39.156 } 00:16:39.156 ] 00:16:39.156 }' 00:16:39.156 23:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:39.156 23:58:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:39.722 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.722 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:39.980 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:39.980 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:40.239 [2024-05-14 23:58:40.676629] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.239 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.498 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:40.498 "name": "Existed_Raid", 00:16:40.498 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:40.498 "strip_size_kb": 64, 00:16:40.498 "state": "configuring", 00:16:40.498 "raid_level": "raid0", 00:16:40.498 "superblock": true, 00:16:40.498 "num_base_bdevs": 4, 00:16:40.498 "num_base_bdevs_discovered": 3, 00:16:40.498 "num_base_bdevs_operational": 4, 00:16:40.498 "base_bdevs_list": [ 00:16:40.498 { 00:16:40.498 "name": null, 00:16:40.498 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:40.498 "is_configured": false, 00:16:40.498 "data_offset": 2048, 00:16:40.498 "data_size": 63488 00:16:40.498 }, 00:16:40.498 { 00:16:40.498 "name": "BaseBdev2", 00:16:40.498 "uuid": 
"386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:40.498 "is_configured": true, 00:16:40.498 "data_offset": 2048, 00:16:40.498 "data_size": 63488 00:16:40.498 }, 00:16:40.498 { 00:16:40.498 "name": "BaseBdev3", 00:16:40.498 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:40.498 "is_configured": true, 00:16:40.498 "data_offset": 2048, 00:16:40.498 "data_size": 63488 00:16:40.498 }, 00:16:40.498 { 00:16:40.498 "name": "BaseBdev4", 00:16:40.498 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:40.498 "is_configured": true, 00:16:40.498 "data_offset": 2048, 00:16:40.498 "data_size": 63488 00:16:40.498 } 00:16:40.498 ] 00:16:40.498 }' 00:16:40.498 23:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:40.498 23:58:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.064 23:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.064 23:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:41.323 23:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:41.323 23:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.323 23:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:41.581 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1e938e3e-edae-4665-9824-0e98b079b371 00:16:41.840 [2024-05-14 23:58:42.245350] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:41.840 [2024-05-14 23:58:42.245524] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad36d0 00:16:41.840 [2024-05-14 23:58:42.245538] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:41.840 [2024-05-14 23:58:42.245715] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c83120 00:16:41.840 [2024-05-14 23:58:42.245834] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad36d0 00:16:41.840 [2024-05-14 23:58:42.245844] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ad36d0 00:16:41.840 [2024-05-14 23:58:42.245934] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:41.840 NewBaseBdev 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:41.840 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:41.840 23:58:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.098 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:42.356 [ 00:16:42.356 { 00:16:42.356 "name": "NewBaseBdev", 00:16:42.356 "aliases": [ 00:16:42.356 "1e938e3e-edae-4665-9824-0e98b079b371" 00:16:42.356 ], 00:16:42.356 "product_name": "Malloc disk", 00:16:42.356 "block_size": 512, 00:16:42.356 "num_blocks": 65536, 00:16:42.356 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:42.356 "assigned_rate_limits": { 00:16:42.356 "rw_ios_per_sec": 0, 00:16:42.356 "rw_mbytes_per_sec": 0, 00:16:42.356 "r_mbytes_per_sec": 0, 00:16:42.356 "w_mbytes_per_sec": 0 00:16:42.356 }, 00:16:42.356 "claimed": true, 00:16:42.356 "claim_type": "exclusive_write", 00:16:42.356 "zoned": false, 00:16:42.356 "supported_io_types": { 00:16:42.356 "read": true, 00:16:42.356 "write": true, 00:16:42.356 "unmap": true, 00:16:42.356 "write_zeroes": true, 00:16:42.356 "flush": true, 00:16:42.356 "reset": true, 00:16:42.356 "compare": false, 00:16:42.356 "compare_and_write": false, 00:16:42.356 "abort": true, 00:16:42.356 "nvme_admin": false, 00:16:42.356 "nvme_io": false 00:16:42.356 }, 00:16:42.356 "memory_domains": [ 00:16:42.356 { 00:16:42.356 "dma_device_id": "system", 00:16:42.356 "dma_device_type": 1 00:16:42.356 }, 00:16:42.356 { 00:16:42.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.356 "dma_device_type": 2 00:16:42.356 } 00:16:42.356 ], 00:16:42.356 "driver_specific": {} 00:16:42.356 } 00:16:42.356 ] 00:16:42.356 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:42.356 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:42.356 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.357 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.615 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:42.615 "name": "Existed_Raid", 
00:16:42.615 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:42.615 "strip_size_kb": 64, 00:16:42.615 "state": "online", 00:16:42.615 "raid_level": "raid0", 00:16:42.615 "superblock": true, 00:16:42.615 "num_base_bdevs": 4, 00:16:42.615 "num_base_bdevs_discovered": 4, 00:16:42.615 "num_base_bdevs_operational": 4, 00:16:42.615 "base_bdevs_list": [ 00:16:42.615 { 00:16:42.615 "name": "NewBaseBdev", 00:16:42.615 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:42.615 "is_configured": true, 00:16:42.615 "data_offset": 2048, 00:16:42.616 "data_size": 63488 00:16:42.616 }, 00:16:42.616 { 00:16:42.616 "name": "BaseBdev2", 00:16:42.616 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:42.616 "is_configured": true, 00:16:42.616 "data_offset": 2048, 00:16:42.616 "data_size": 63488 00:16:42.616 }, 00:16:42.616 { 00:16:42.616 "name": "BaseBdev3", 00:16:42.616 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:42.616 "is_configured": true, 00:16:42.616 "data_offset": 2048, 00:16:42.616 "data_size": 63488 00:16:42.616 }, 00:16:42.616 { 00:16:42.616 "name": "BaseBdev4", 00:16:42.616 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:42.616 "is_configured": true, 00:16:42.616 "data_offset": 2048, 00:16:42.616 "data_size": 63488 00:16:42.616 } 00:16:42.616 ] 00:16:42.616 }' 00:16:42.616 23:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:42.616 23:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:43.183 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:43.441 [2024-05-14 23:58:43.810041] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:43.441 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:43.441 "name": "Existed_Raid", 00:16:43.441 "aliases": [ 00:16:43.441 "119e1ce3-5dfc-4f87-9acd-066c875a4215" 00:16:43.441 ], 00:16:43.441 "product_name": "Raid Volume", 00:16:43.441 "block_size": 512, 00:16:43.441 "num_blocks": 253952, 00:16:43.441 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:43.441 "assigned_rate_limits": { 00:16:43.441 "rw_ios_per_sec": 0, 00:16:43.441 "rw_mbytes_per_sec": 0, 00:16:43.441 "r_mbytes_per_sec": 0, 00:16:43.441 "w_mbytes_per_sec": 0 00:16:43.441 }, 00:16:43.441 "claimed": false, 00:16:43.441 "zoned": false, 00:16:43.441 "supported_io_types": { 00:16:43.441 "read": true, 00:16:43.441 "write": true, 00:16:43.441 "unmap": true, 00:16:43.441 "write_zeroes": true, 00:16:43.441 "flush": true, 00:16:43.441 "reset": true, 00:16:43.441 "compare": false, 00:16:43.441 
"compare_and_write": false, 00:16:43.442 "abort": false, 00:16:43.442 "nvme_admin": false, 00:16:43.442 "nvme_io": false 00:16:43.442 }, 00:16:43.442 "memory_domains": [ 00:16:43.442 { 00:16:43.442 "dma_device_id": "system", 00:16:43.442 "dma_device_type": 1 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.442 "dma_device_type": 2 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "system", 00:16:43.442 "dma_device_type": 1 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.442 "dma_device_type": 2 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "system", 00:16:43.442 "dma_device_type": 1 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.442 "dma_device_type": 2 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "system", 00:16:43.442 "dma_device_type": 1 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.442 "dma_device_type": 2 00:16:43.442 } 00:16:43.442 ], 00:16:43.442 "driver_specific": { 00:16:43.442 "raid": { 00:16:43.442 "uuid": "119e1ce3-5dfc-4f87-9acd-066c875a4215", 00:16:43.442 "strip_size_kb": 64, 00:16:43.442 "state": "online", 00:16:43.442 "raid_level": "raid0", 00:16:43.442 "superblock": true, 00:16:43.442 "num_base_bdevs": 4, 00:16:43.442 "num_base_bdevs_discovered": 4, 00:16:43.442 "num_base_bdevs_operational": 4, 00:16:43.442 "base_bdevs_list": [ 00:16:43.442 { 00:16:43.442 "name": "NewBaseBdev", 00:16:43.442 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:43.442 "is_configured": true, 00:16:43.442 "data_offset": 2048, 00:16:43.442 "data_size": 63488 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "name": "BaseBdev2", 00:16:43.442 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:43.442 "is_configured": true, 00:16:43.442 "data_offset": 2048, 00:16:43.442 "data_size": 63488 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "name": "BaseBdev3", 00:16:43.442 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:43.442 "is_configured": true, 00:16:43.442 "data_offset": 2048, 00:16:43.442 "data_size": 63488 00:16:43.442 }, 00:16:43.442 { 00:16:43.442 "name": "BaseBdev4", 00:16:43.442 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:43.442 "is_configured": true, 00:16:43.442 "data_offset": 2048, 00:16:43.442 "data_size": 63488 00:16:43.442 } 00:16:43.442 ] 00:16:43.442 } 00:16:43.442 } 00:16:43.442 }' 00:16:43.442 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:43.442 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:16:43.442 BaseBdev2 00:16:43.442 BaseBdev3 00:16:43.442 BaseBdev4' 00:16:43.442 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:43.442 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:43.442 23:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:43.700 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:43.700 "name": "NewBaseBdev", 00:16:43.700 "aliases": [ 00:16:43.701 "1e938e3e-edae-4665-9824-0e98b079b371" 00:16:43.701 ], 00:16:43.701 "product_name": "Malloc disk", 
00:16:43.701 "block_size": 512, 00:16:43.701 "num_blocks": 65536, 00:16:43.701 "uuid": "1e938e3e-edae-4665-9824-0e98b079b371", 00:16:43.701 "assigned_rate_limits": { 00:16:43.701 "rw_ios_per_sec": 0, 00:16:43.701 "rw_mbytes_per_sec": 0, 00:16:43.701 "r_mbytes_per_sec": 0, 00:16:43.701 "w_mbytes_per_sec": 0 00:16:43.701 }, 00:16:43.701 "claimed": true, 00:16:43.701 "claim_type": "exclusive_write", 00:16:43.701 "zoned": false, 00:16:43.701 "supported_io_types": { 00:16:43.701 "read": true, 00:16:43.701 "write": true, 00:16:43.701 "unmap": true, 00:16:43.701 "write_zeroes": true, 00:16:43.701 "flush": true, 00:16:43.701 "reset": true, 00:16:43.701 "compare": false, 00:16:43.701 "compare_and_write": false, 00:16:43.701 "abort": true, 00:16:43.701 "nvme_admin": false, 00:16:43.701 "nvme_io": false 00:16:43.701 }, 00:16:43.701 "memory_domains": [ 00:16:43.701 { 00:16:43.701 "dma_device_id": "system", 00:16:43.701 "dma_device_type": 1 00:16:43.701 }, 00:16:43.701 { 00:16:43.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.701 "dma_device_type": 2 00:16:43.701 } 00:16:43.701 ], 00:16:43.701 "driver_specific": {} 00:16:43.701 }' 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:43.701 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:43.959 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:44.218 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:44.218 "name": "BaseBdev2", 00:16:44.218 "aliases": [ 00:16:44.218 "386e68b8-5446-4b8c-b4db-f305201a68a2" 00:16:44.218 ], 00:16:44.218 "product_name": "Malloc disk", 00:16:44.218 "block_size": 512, 00:16:44.218 "num_blocks": 65536, 00:16:44.218 "uuid": "386e68b8-5446-4b8c-b4db-f305201a68a2", 00:16:44.218 "assigned_rate_limits": { 00:16:44.218 "rw_ios_per_sec": 0, 00:16:44.218 "rw_mbytes_per_sec": 0, 00:16:44.218 "r_mbytes_per_sec": 0, 00:16:44.218 "w_mbytes_per_sec": 0 00:16:44.218 }, 00:16:44.218 "claimed": true, 00:16:44.218 "claim_type": "exclusive_write", 00:16:44.218 "zoned": false, 
00:16:44.218 "supported_io_types": { 00:16:44.218 "read": true, 00:16:44.218 "write": true, 00:16:44.218 "unmap": true, 00:16:44.218 "write_zeroes": true, 00:16:44.218 "flush": true, 00:16:44.218 "reset": true, 00:16:44.218 "compare": false, 00:16:44.218 "compare_and_write": false, 00:16:44.218 "abort": true, 00:16:44.218 "nvme_admin": false, 00:16:44.218 "nvme_io": false 00:16:44.218 }, 00:16:44.218 "memory_domains": [ 00:16:44.218 { 00:16:44.218 "dma_device_id": "system", 00:16:44.218 "dma_device_type": 1 00:16:44.218 }, 00:16:44.218 { 00:16:44.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.218 "dma_device_type": 2 00:16:44.218 } 00:16:44.218 ], 00:16:44.218 "driver_specific": {} 00:16:44.218 }' 00:16:44.218 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:44.218 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:44.218 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:44.218 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:44.476 23:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:44.476 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:44.476 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:44.476 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:44.476 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:44.735 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:44.735 "name": "BaseBdev3", 00:16:44.735 "aliases": [ 00:16:44.735 "a440e224-c32a-45ec-abce-6da813f72f62" 00:16:44.735 ], 00:16:44.735 "product_name": "Malloc disk", 00:16:44.735 "block_size": 512, 00:16:44.735 "num_blocks": 65536, 00:16:44.735 "uuid": "a440e224-c32a-45ec-abce-6da813f72f62", 00:16:44.735 "assigned_rate_limits": { 00:16:44.735 "rw_ios_per_sec": 0, 00:16:44.735 "rw_mbytes_per_sec": 0, 00:16:44.735 "r_mbytes_per_sec": 0, 00:16:44.735 "w_mbytes_per_sec": 0 00:16:44.735 }, 00:16:44.735 "claimed": true, 00:16:44.735 "claim_type": "exclusive_write", 00:16:44.735 "zoned": false, 00:16:44.735 "supported_io_types": { 00:16:44.735 "read": true, 00:16:44.735 "write": true, 00:16:44.735 "unmap": true, 00:16:44.735 "write_zeroes": true, 00:16:44.735 "flush": true, 00:16:44.735 "reset": true, 00:16:44.735 "compare": false, 00:16:44.735 "compare_and_write": false, 00:16:44.735 "abort": true, 00:16:44.735 "nvme_admin": false, 00:16:44.735 "nvme_io": false 00:16:44.735 }, 00:16:44.735 "memory_domains": [ 
00:16:44.735 { 00:16:44.735 "dma_device_id": "system", 00:16:44.735 "dma_device_type": 1 00:16:44.735 }, 00:16:44.735 { 00:16:44.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.735 "dma_device_type": 2 00:16:44.735 } 00:16:44.735 ], 00:16:44.735 "driver_specific": {} 00:16:44.735 }' 00:16:44.735 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:44.735 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:44.994 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:45.253 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:45.253 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:45.253 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:45.253 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:45.512 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:45.512 "name": "BaseBdev4", 00:16:45.512 "aliases": [ 00:16:45.512 "5de28cc0-b0fe-483b-9622-8fc4523fb17e" 00:16:45.512 ], 00:16:45.512 "product_name": "Malloc disk", 00:16:45.512 "block_size": 512, 00:16:45.512 "num_blocks": 65536, 00:16:45.512 "uuid": "5de28cc0-b0fe-483b-9622-8fc4523fb17e", 00:16:45.512 "assigned_rate_limits": { 00:16:45.512 "rw_ios_per_sec": 0, 00:16:45.512 "rw_mbytes_per_sec": 0, 00:16:45.512 "r_mbytes_per_sec": 0, 00:16:45.512 "w_mbytes_per_sec": 0 00:16:45.512 }, 00:16:45.512 "claimed": true, 00:16:45.512 "claim_type": "exclusive_write", 00:16:45.512 "zoned": false, 00:16:45.512 "supported_io_types": { 00:16:45.512 "read": true, 00:16:45.512 "write": true, 00:16:45.512 "unmap": true, 00:16:45.512 "write_zeroes": true, 00:16:45.512 "flush": true, 00:16:45.512 "reset": true, 00:16:45.512 "compare": false, 00:16:45.512 "compare_and_write": false, 00:16:45.512 "abort": true, 00:16:45.512 "nvme_admin": false, 00:16:45.512 "nvme_io": false 00:16:45.512 }, 00:16:45.512 "memory_domains": [ 00:16:45.512 { 00:16:45.512 "dma_device_id": "system", 00:16:45.512 "dma_device_type": 1 00:16:45.512 }, 00:16:45.512 { 00:16:45.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.512 "dma_device_type": 2 00:16:45.512 } 00:16:45.512 ], 00:16:45.512 "driver_specific": {} 00:16:45.512 }' 00:16:45.512 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:45.512 23:58:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:45.512 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:45.512 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:45.512 23:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:45.512 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.513 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:45.513 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:45.771 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.771 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:45.771 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:45.771 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:45.771 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:46.030 [2024-05-14 23:58:46.428715] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:46.030 [2024-05-14 23:58:46.428741] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:46.030 [2024-05-14 23:58:46.428793] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:46.030 [2024-05-14 23:58:46.428854] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:46.030 [2024-05-14 23:58:46.428865] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad36d0 name Existed_Raid, state offline 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 439024 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 439024 ']' 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 439024 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 439024 00:16:46.030 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:46.031 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:46.031 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 439024' 00:16:46.031 killing process with pid 439024 00:16:46.031 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 439024 00:16:46.031 [2024-05-14 23:58:46.501948] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:46.031 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 439024 00:16:46.031 [2024-05-14 23:58:46.539754] bdev_raid.c:1375:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:16:46.289 23:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:16:46.289 00:16:46.289 real 0m31.785s 00:16:46.289 user 0m58.331s 00:16:46.289 sys 0m5.686s 00:16:46.289 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:46.289 23:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.289 ************************************ 00:16:46.289 END TEST raid_state_function_test_sb 00:16:46.289 ************************************ 00:16:46.289 23:58:46 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:16:46.289 23:58:46 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:16:46.289 23:58:46 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:46.289 23:58:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:46.289 ************************************ 00:16:46.289 START TEST raid_superblock_test 00:16:46.289 ************************************ 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 4 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:16:46.289 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=443895 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 443895 /var/tmp/spdk-raid.sock 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 443895 ']' 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:46.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:46.548 23:58:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.548 [2024-05-14 23:58:46.935596] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:16:46.548 [2024-05-14 23:58:46.935662] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid443895 ] 00:16:46.548 [2024-05-14 23:58:47.061675] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:46.807 [2024-05-14 23:58:47.159654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:46.807 [2024-05-14 23:58:47.224044] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.807 [2024-05-14 23:58:47.224092] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:47.374 23:58:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:47.633 malloc1 00:16:47.633 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:47.892 [2024-05-14 23:58:48.253251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:47.892 [2024-05-14 23:58:48.253301] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:47.892 [2024-05-14 23:58:48.253332] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2326780 00:16:47.892 [2024-05-14 23:58:48.253348] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:47.892 
[2024-05-14 23:58:48.255233] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:47.892 [2024-05-14 23:58:48.255265] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:47.892 pt1 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:47.892 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:48.188 malloc2 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:48.188 [2024-05-14 23:58:48.680469] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:48.188 [2024-05-14 23:58:48.680518] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.188 [2024-05-14 23:58:48.680544] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2327b60 00:16:48.188 [2024-05-14 23:58:48.680561] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.188 [2024-05-14 23:58:48.682030] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:48.188 [2024-05-14 23:58:48.682060] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:48.188 pt2 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:48.188 23:58:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:48.455 malloc3 00:16:48.455 23:58:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:48.713 [2024-05-14 23:58:49.122277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:48.713 [2024-05-14 23:58:49.122331] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.713 [2024-05-14 23:58:49.122366] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d2080 00:16:48.713 [2024-05-14 23:58:49.122383] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.713 [2024-05-14 23:58:49.124088] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:48.713 [2024-05-14 23:58:49.124121] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:48.713 pt3 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:48.713 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:48.972 malloc4 00:16:48.972 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:49.231 [2024-05-14 23:58:49.593363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:49.231 [2024-05-14 23:58:49.593419] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:49.231 [2024-05-14 23:58:49.593450] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d4610 00:16:49.231 [2024-05-14 23:58:49.593469] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:49.231 [2024-05-14 23:58:49.595103] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:49.231 [2024-05-14 23:58:49.595135] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:49.231 pt4 00:16:49.231 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:49.231 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:49.231 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:49.490 [2024-05-14 23:58:49.834030] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:49.490 [2024-05-14 23:58:49.835405] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:49.490 [2024-05-14 23:58:49.835462] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:49.490 [2024-05-14 23:58:49.835505] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:49.490 [2024-05-14 23:58:49.835683] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x24d6480 00:16:49.490 [2024-05-14 23:58:49.835694] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:49.490 [2024-05-14 23:58:49.835898] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d5310 00:16:49.490 [2024-05-14 23:58:49.836049] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24d6480 00:16:49.490 [2024-05-14 23:58:49.836058] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24d6480 00:16:49.490 [2024-05-14 23:58:49.836159] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.490 23:58:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:49.490 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:49.490 "name": "raid_bdev1", 00:16:49.490 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:49.490 "strip_size_kb": 64, 00:16:49.490 "state": "online", 00:16:49.490 "raid_level": "raid0", 00:16:49.490 "superblock": true, 00:16:49.490 "num_base_bdevs": 4, 00:16:49.490 "num_base_bdevs_discovered": 4, 00:16:49.490 "num_base_bdevs_operational": 4, 00:16:49.490 "base_bdevs_list": [ 00:16:49.490 { 00:16:49.490 "name": "pt1", 00:16:49.490 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:49.490 "is_configured": true, 00:16:49.490 "data_offset": 2048, 00:16:49.490 "data_size": 63488 00:16:49.490 }, 00:16:49.490 { 00:16:49.490 "name": "pt2", 00:16:49.490 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:49.490 "is_configured": true, 00:16:49.490 "data_offset": 2048, 00:16:49.490 "data_size": 63488 00:16:49.490 }, 
00:16:49.490 { 00:16:49.490 "name": "pt3", 00:16:49.490 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:49.490 "is_configured": true, 00:16:49.490 "data_offset": 2048, 00:16:49.490 "data_size": 63488 00:16:49.490 }, 00:16:49.490 { 00:16:49.490 "name": "pt4", 00:16:49.490 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:49.490 "is_configured": true, 00:16:49.490 "data_offset": 2048, 00:16:49.490 "data_size": 63488 00:16:49.490 } 00:16:49.490 ] 00:16:49.490 }' 00:16:49.490 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:49.490 23:58:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:50.424 [2024-05-14 23:58:50.877042] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:50.424 "name": "raid_bdev1", 00:16:50.424 "aliases": [ 00:16:50.424 "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3" 00:16:50.424 ], 00:16:50.424 "product_name": "Raid Volume", 00:16:50.424 "block_size": 512, 00:16:50.424 "num_blocks": 253952, 00:16:50.424 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:50.424 "assigned_rate_limits": { 00:16:50.424 "rw_ios_per_sec": 0, 00:16:50.424 "rw_mbytes_per_sec": 0, 00:16:50.424 "r_mbytes_per_sec": 0, 00:16:50.424 "w_mbytes_per_sec": 0 00:16:50.424 }, 00:16:50.424 "claimed": false, 00:16:50.424 "zoned": false, 00:16:50.424 "supported_io_types": { 00:16:50.424 "read": true, 00:16:50.424 "write": true, 00:16:50.424 "unmap": true, 00:16:50.424 "write_zeroes": true, 00:16:50.424 "flush": true, 00:16:50.424 "reset": true, 00:16:50.424 "compare": false, 00:16:50.424 "compare_and_write": false, 00:16:50.424 "abort": false, 00:16:50.424 "nvme_admin": false, 00:16:50.424 "nvme_io": false 00:16:50.424 }, 00:16:50.424 "memory_domains": [ 00:16:50.424 { 00:16:50.424 "dma_device_id": "system", 00:16:50.424 "dma_device_type": 1 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.424 "dma_device_type": 2 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "system", 00:16:50.424 "dma_device_type": 1 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.424 "dma_device_type": 2 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "system", 00:16:50.424 "dma_device_type": 1 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.424 "dma_device_type": 2 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "system", 00:16:50.424 
"dma_device_type": 1 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.424 "dma_device_type": 2 00:16:50.424 } 00:16:50.424 ], 00:16:50.424 "driver_specific": { 00:16:50.424 "raid": { 00:16:50.424 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:50.424 "strip_size_kb": 64, 00:16:50.424 "state": "online", 00:16:50.424 "raid_level": "raid0", 00:16:50.424 "superblock": true, 00:16:50.424 "num_base_bdevs": 4, 00:16:50.424 "num_base_bdevs_discovered": 4, 00:16:50.424 "num_base_bdevs_operational": 4, 00:16:50.424 "base_bdevs_list": [ 00:16:50.424 { 00:16:50.424 "name": "pt1", 00:16:50.424 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:50.424 "is_configured": true, 00:16:50.424 "data_offset": 2048, 00:16:50.424 "data_size": 63488 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "name": "pt2", 00:16:50.424 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:50.424 "is_configured": true, 00:16:50.424 "data_offset": 2048, 00:16:50.424 "data_size": 63488 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "name": "pt3", 00:16:50.424 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:50.424 "is_configured": true, 00:16:50.424 "data_offset": 2048, 00:16:50.424 "data_size": 63488 00:16:50.424 }, 00:16:50.424 { 00:16:50.424 "name": "pt4", 00:16:50.424 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:50.424 "is_configured": true, 00:16:50.424 "data_offset": 2048, 00:16:50.424 "data_size": 63488 00:16:50.424 } 00:16:50.424 ] 00:16:50.424 } 00:16:50.424 } 00:16:50.424 }' 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:50.424 pt2 00:16:50.424 pt3 00:16:50.424 pt4' 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:50.424 23:58:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:50.683 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:50.683 "name": "pt1", 00:16:50.683 "aliases": [ 00:16:50.683 "1806c7e0-8814-5e58-9e6e-22580da78ed3" 00:16:50.683 ], 00:16:50.683 "product_name": "passthru", 00:16:50.683 "block_size": 512, 00:16:50.683 "num_blocks": 65536, 00:16:50.683 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:50.683 "assigned_rate_limits": { 00:16:50.683 "rw_ios_per_sec": 0, 00:16:50.683 "rw_mbytes_per_sec": 0, 00:16:50.683 "r_mbytes_per_sec": 0, 00:16:50.683 "w_mbytes_per_sec": 0 00:16:50.683 }, 00:16:50.683 "claimed": true, 00:16:50.683 "claim_type": "exclusive_write", 00:16:50.683 "zoned": false, 00:16:50.683 "supported_io_types": { 00:16:50.683 "read": true, 00:16:50.683 "write": true, 00:16:50.683 "unmap": true, 00:16:50.683 "write_zeroes": true, 00:16:50.683 "flush": true, 00:16:50.683 "reset": true, 00:16:50.683 "compare": false, 00:16:50.683 "compare_and_write": false, 00:16:50.683 "abort": true, 00:16:50.683 "nvme_admin": false, 00:16:50.683 "nvme_io": false 00:16:50.683 }, 00:16:50.683 "memory_domains": [ 00:16:50.683 { 00:16:50.683 "dma_device_id": "system", 00:16:50.683 "dma_device_type": 1 00:16:50.683 }, 00:16:50.683 { 00:16:50.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:16:50.683 "dma_device_type": 2 00:16:50.683 } 00:16:50.683 ], 00:16:50.683 "driver_specific": { 00:16:50.683 "passthru": { 00:16:50.683 "name": "pt1", 00:16:50.683 "base_bdev_name": "malloc1" 00:16:50.683 } 00:16:50.683 } 00:16:50.683 }' 00:16:50.683 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:50.683 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:50.941 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:51.199 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:51.199 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:51.199 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:51.199 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:51.458 "name": "pt2", 00:16:51.458 "aliases": [ 00:16:51.458 "571fb3b0-056f-5723-8aac-a20b68df19ad" 00:16:51.458 ], 00:16:51.458 "product_name": "passthru", 00:16:51.458 "block_size": 512, 00:16:51.458 "num_blocks": 65536, 00:16:51.458 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:51.458 "assigned_rate_limits": { 00:16:51.458 "rw_ios_per_sec": 0, 00:16:51.458 "rw_mbytes_per_sec": 0, 00:16:51.458 "r_mbytes_per_sec": 0, 00:16:51.458 "w_mbytes_per_sec": 0 00:16:51.458 }, 00:16:51.458 "claimed": true, 00:16:51.458 "claim_type": "exclusive_write", 00:16:51.458 "zoned": false, 00:16:51.458 "supported_io_types": { 00:16:51.458 "read": true, 00:16:51.458 "write": true, 00:16:51.458 "unmap": true, 00:16:51.458 "write_zeroes": true, 00:16:51.458 "flush": true, 00:16:51.458 "reset": true, 00:16:51.458 "compare": false, 00:16:51.458 "compare_and_write": false, 00:16:51.458 "abort": true, 00:16:51.458 "nvme_admin": false, 00:16:51.458 "nvme_io": false 00:16:51.458 }, 00:16:51.458 "memory_domains": [ 00:16:51.458 { 00:16:51.458 "dma_device_id": "system", 00:16:51.458 "dma_device_type": 1 00:16:51.458 }, 00:16:51.458 { 00:16:51.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.458 "dma_device_type": 2 00:16:51.458 } 00:16:51.458 ], 00:16:51.458 "driver_specific": { 00:16:51.458 "passthru": { 00:16:51.458 "name": "pt2", 00:16:51.458 "base_bdev_name": "malloc2" 00:16:51.458 } 00:16:51.458 } 00:16:51.458 }' 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 
00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.458 23:58:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:51.458 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:51.716 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:51.974 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:51.974 "name": "pt3", 00:16:51.974 "aliases": [ 00:16:51.974 "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e" 00:16:51.974 ], 00:16:51.974 "product_name": "passthru", 00:16:51.974 "block_size": 512, 00:16:51.974 "num_blocks": 65536, 00:16:51.974 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:51.974 "assigned_rate_limits": { 00:16:51.974 "rw_ios_per_sec": 0, 00:16:51.974 "rw_mbytes_per_sec": 0, 00:16:51.974 "r_mbytes_per_sec": 0, 00:16:51.974 "w_mbytes_per_sec": 0 00:16:51.974 }, 00:16:51.974 "claimed": true, 00:16:51.974 "claim_type": "exclusive_write", 00:16:51.974 "zoned": false, 00:16:51.974 "supported_io_types": { 00:16:51.974 "read": true, 00:16:51.974 "write": true, 00:16:51.974 "unmap": true, 00:16:51.974 "write_zeroes": true, 00:16:51.974 "flush": true, 00:16:51.974 "reset": true, 00:16:51.974 "compare": false, 00:16:51.974 "compare_and_write": false, 00:16:51.974 "abort": true, 00:16:51.974 "nvme_admin": false, 00:16:51.974 "nvme_io": false 00:16:51.974 }, 00:16:51.974 "memory_domains": [ 00:16:51.974 { 00:16:51.974 "dma_device_id": "system", 00:16:51.974 "dma_device_type": 1 00:16:51.974 }, 00:16:51.974 { 00:16:51.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.974 "dma_device_type": 2 00:16:51.974 } 00:16:51.974 ], 00:16:51.974 "driver_specific": { 00:16:51.974 "passthru": { 00:16:51.974 "name": "pt3", 00:16:51.974 "base_bdev_name": "malloc3" 00:16:51.974 } 00:16:51.975 } 00:16:51.975 }' 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:51.975 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.975 23:58:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:52.233 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:52.234 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:52.492 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:52.492 "name": "pt4", 00:16:52.492 "aliases": [ 00:16:52.492 "ebed89c0-129d-54df-b617-3e5b7b77acb0" 00:16:52.492 ], 00:16:52.492 "product_name": "passthru", 00:16:52.492 "block_size": 512, 00:16:52.492 "num_blocks": 65536, 00:16:52.493 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:52.493 "assigned_rate_limits": { 00:16:52.493 "rw_ios_per_sec": 0, 00:16:52.493 "rw_mbytes_per_sec": 0, 00:16:52.493 "r_mbytes_per_sec": 0, 00:16:52.493 "w_mbytes_per_sec": 0 00:16:52.493 }, 00:16:52.493 "claimed": true, 00:16:52.493 "claim_type": "exclusive_write", 00:16:52.493 "zoned": false, 00:16:52.493 "supported_io_types": { 00:16:52.493 "read": true, 00:16:52.493 "write": true, 00:16:52.493 "unmap": true, 00:16:52.493 "write_zeroes": true, 00:16:52.493 "flush": true, 00:16:52.493 "reset": true, 00:16:52.493 "compare": false, 00:16:52.493 "compare_and_write": false, 00:16:52.493 "abort": true, 00:16:52.493 "nvme_admin": false, 00:16:52.493 "nvme_io": false 00:16:52.493 }, 00:16:52.493 "memory_domains": [ 00:16:52.493 { 00:16:52.493 "dma_device_id": "system", 00:16:52.493 "dma_device_type": 1 00:16:52.493 }, 00:16:52.493 { 00:16:52.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.493 "dma_device_type": 2 00:16:52.493 } 00:16:52.493 ], 00:16:52.493 "driver_specific": { 00:16:52.493 "passthru": { 00:16:52.493 "name": "pt4", 00:16:52.493 "base_bdev_name": "malloc4" 00:16:52.493 } 00:16:52.493 } 00:16:52.493 }' 00:16:52.493 23:58:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:52.493 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:52.493 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:52.493 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:16:52.751 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:53.009 [2024-05-14 23:58:53.548117] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.009 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3 00:16:53.009 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3 ']' 00:16:53.009 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:53.267 [2024-05-14 23:58:53.788457] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:53.267 [2024-05-14 23:58:53.788481] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:53.267 [2024-05-14 23:58:53.788535] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:53.267 [2024-05-14 23:58:53.788601] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:53.267 [2024-05-14 23:58:53.788613] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d6480 name raid_bdev1, state offline 00:16:53.267 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.267 23:58:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:16:53.524 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:16:53.524 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:16:53.524 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:53.524 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:53.782 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:53.782 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:54.040 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:54.040 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:54.298 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:54.298 23:58:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:54.556 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:54.556 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:54.813 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:55.083 [2024-05-14 23:58:55.492952] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:55.083 [2024-05-14 23:58:55.494342] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:55.083 [2024-05-14 23:58:55.494387] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:55.083 [2024-05-14 23:58:55.494429] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:55.083 [2024-05-14 23:58:55.494476] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:55.083 [2024-05-14 23:58:55.494518] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:55.083 [2024-05-14 23:58:55.494547] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:55.083 [2024-05-14 23:58:55.494574] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a 
different raid bdev found on bdev malloc4 00:16:55.083 [2024-05-14 23:58:55.494599] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:55.083 [2024-05-14 23:58:55.494615] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d7350 name raid_bdev1, state configuring 00:16:55.083 request: 00:16:55.083 { 00:16:55.083 "name": "raid_bdev1", 00:16:55.083 "raid_level": "raid0", 00:16:55.083 "base_bdevs": [ 00:16:55.083 "malloc1", 00:16:55.083 "malloc2", 00:16:55.083 "malloc3", 00:16:55.083 "malloc4" 00:16:55.084 ], 00:16:55.084 "superblock": false, 00:16:55.084 "strip_size_kb": 64, 00:16:55.084 "method": "bdev_raid_create", 00:16:55.084 "req_id": 1 00:16:55.084 } 00:16:55.084 Got JSON-RPC error response 00:16:55.084 response: 00:16:55.084 { 00:16:55.084 "code": -17, 00:16:55.084 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:55.084 } 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.084 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:16:55.341 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:16:55.341 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:16:55.341 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:55.599 [2024-05-14 23:58:55.974159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:55.599 [2024-05-14 23:58:55.974209] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:55.599 [2024-05-14 23:58:55.974238] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24cfaa0 00:16:55.599 [2024-05-14 23:58:55.974255] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:55.599 [2024-05-14 23:58:55.975905] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:55.599 [2024-05-14 23:58:55.975941] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:55.599 [2024-05-14 23:58:55.976029] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:16:55.599 [2024-05-14 23:58:55.976064] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:55.599 pt1 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:55.599 23:58:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.599 23:58:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:55.857 23:58:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:55.857 "name": "raid_bdev1", 00:16:55.857 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:55.857 "strip_size_kb": 64, 00:16:55.857 "state": "configuring", 00:16:55.857 "raid_level": "raid0", 00:16:55.857 "superblock": true, 00:16:55.857 "num_base_bdevs": 4, 00:16:55.857 "num_base_bdevs_discovered": 1, 00:16:55.857 "num_base_bdevs_operational": 4, 00:16:55.857 "base_bdevs_list": [ 00:16:55.857 { 00:16:55.857 "name": "pt1", 00:16:55.857 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:55.857 "is_configured": true, 00:16:55.857 "data_offset": 2048, 00:16:55.857 "data_size": 63488 00:16:55.857 }, 00:16:55.857 { 00:16:55.857 "name": null, 00:16:55.857 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:55.857 "is_configured": false, 00:16:55.857 "data_offset": 2048, 00:16:55.857 "data_size": 63488 00:16:55.857 }, 00:16:55.857 { 00:16:55.857 "name": null, 00:16:55.857 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:55.857 "is_configured": false, 00:16:55.857 "data_offset": 2048, 00:16:55.857 "data_size": 63488 00:16:55.857 }, 00:16:55.857 { 00:16:55.857 "name": null, 00:16:55.857 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:55.857 "is_configured": false, 00:16:55.857 "data_offset": 2048, 00:16:55.857 "data_size": 63488 00:16:55.857 } 00:16:55.857 ] 00:16:55.857 }' 00:16:55.857 23:58:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:55.857 23:58:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.423 23:58:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:16:56.423 23:58:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:56.682 [2024-05-14 23:58:57.032980] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:56.682 [2024-05-14 23:58:57.033030] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:56.682 [2024-05-14 23:58:57.033052] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d0e00 00:16:56.682 [2024-05-14 23:58:57.033065] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:56.682 [2024-05-14 23:58:57.033422] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:56.682 [2024-05-14 23:58:57.033447] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:56.682 [2024-05-14 23:58:57.033526] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:16:56.682 [2024-05-14 23:58:57.033547] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:56.682 pt2 00:16:56.682 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:56.682 [2024-05-14 23:58:57.269633] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.940 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:57.198 23:58:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:57.198 "name": "raid_bdev1", 00:16:57.198 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:57.198 "strip_size_kb": 64, 00:16:57.198 "state": "configuring", 00:16:57.198 "raid_level": "raid0", 00:16:57.198 "superblock": true, 00:16:57.198 "num_base_bdevs": 4, 00:16:57.198 "num_base_bdevs_discovered": 1, 00:16:57.198 "num_base_bdevs_operational": 4, 00:16:57.198 "base_bdevs_list": [ 00:16:57.198 { 00:16:57.198 "name": "pt1", 00:16:57.198 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:57.198 "is_configured": true, 00:16:57.198 "data_offset": 2048, 00:16:57.198 "data_size": 63488 00:16:57.198 }, 00:16:57.198 { 00:16:57.198 "name": null, 00:16:57.198 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:57.198 "is_configured": false, 00:16:57.198 "data_offset": 2048, 00:16:57.198 "data_size": 63488 00:16:57.198 }, 00:16:57.198 { 00:16:57.198 "name": null, 00:16:57.198 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:57.198 "is_configured": false, 00:16:57.198 "data_offset": 2048, 00:16:57.198 "data_size": 63488 00:16:57.198 }, 00:16:57.198 { 00:16:57.198 "name": null, 00:16:57.198 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:57.198 "is_configured": false, 00:16:57.198 "data_offset": 2048, 00:16:57.198 "data_size": 63488 00:16:57.198 } 00:16:57.198 ] 00:16:57.198 }' 00:16:57.198 23:58:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:57.198 23:58:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.765 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:16:57.765 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:57.765 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:57.765 [2024-05-14 23:58:58.352491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:57.765 [2024-05-14 23:58:58.352543] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.765 [2024-05-14 23:58:58.352564] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2327370 00:16:57.765 [2024-05-14 23:58:58.352577] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.765 [2024-05-14 23:58:58.352910] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.765 [2024-05-14 23:58:58.352927] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:57.765 [2024-05-14 23:58:58.352991] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:16:57.765 [2024-05-14 23:58:58.353010] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:58.024 pt2 00:16:58.024 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:58.024 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:58.024 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:58.024 [2024-05-14 23:58:58.597128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:58.024 [2024-05-14 23:58:58.597168] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.024 [2024-05-14 23:58:58.597186] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d7700 00:16:58.024 [2024-05-14 23:58:58.597199] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.024 [2024-05-14 23:58:58.597521] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.024 [2024-05-14 23:58:58.597539] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:58.024 [2024-05-14 23:58:58.597591] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:16:58.024 [2024-05-14 23:58:58.597609] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:58.024 pt3 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:58.282 [2024-05-14 23:58:58.841785] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:58.282 [2024-05-14 23:58:58.841829] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.282 [2024-05-14 23:58:58.841848] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2325280 00:16:58.282 [2024-05-14 23:58:58.841860] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.282 [2024-05-14 23:58:58.842196] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.282 [2024-05-14 23:58:58.842213] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:58.282 [2024-05-14 23:58:58.842272] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:16:58.282 [2024-05-14 23:58:58.842289] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:58.282 [2024-05-14 23:58:58.842423] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x24d7a20 00:16:58.282 [2024-05-14 23:58:58.842434] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:58.282 [2024-05-14 23:58:58.842609] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c87d0 00:16:58.282 [2024-05-14 23:58:58.842749] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24d7a20 00:16:58.282 [2024-05-14 23:58:58.842758] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24d7a20 00:16:58.282 [2024-05-14 23:58:58.842856] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.282 pt4 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.282 23:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.540 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:58.540 "name": "raid_bdev1", 00:16:58.540 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:58.540 "strip_size_kb": 64, 
00:16:58.540 "state": "online", 00:16:58.540 "raid_level": "raid0", 00:16:58.540 "superblock": true, 00:16:58.540 "num_base_bdevs": 4, 00:16:58.540 "num_base_bdevs_discovered": 4, 00:16:58.540 "num_base_bdevs_operational": 4, 00:16:58.540 "base_bdevs_list": [ 00:16:58.540 { 00:16:58.540 "name": "pt1", 00:16:58.540 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:58.540 "is_configured": true, 00:16:58.540 "data_offset": 2048, 00:16:58.540 "data_size": 63488 00:16:58.540 }, 00:16:58.540 { 00:16:58.540 "name": "pt2", 00:16:58.540 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:58.540 "is_configured": true, 00:16:58.540 "data_offset": 2048, 00:16:58.540 "data_size": 63488 00:16:58.540 }, 00:16:58.540 { 00:16:58.540 "name": "pt3", 00:16:58.540 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:58.540 "is_configured": true, 00:16:58.540 "data_offset": 2048, 00:16:58.540 "data_size": 63488 00:16:58.540 }, 00:16:58.540 { 00:16:58.540 "name": "pt4", 00:16:58.540 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:58.540 "is_configured": true, 00:16:58.540 "data_offset": 2048, 00:16:58.540 "data_size": 63488 00:16:58.540 } 00:16:58.540 ] 00:16:58.540 }' 00:16:58.540 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:58.540 23:58:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:59.105 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:59.364 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:59.364 [2024-05-14 23:58:59.916927] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:59.364 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:59.364 "name": "raid_bdev1", 00:16:59.364 "aliases": [ 00:16:59.364 "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3" 00:16:59.364 ], 00:16:59.364 "product_name": "Raid Volume", 00:16:59.364 "block_size": 512, 00:16:59.364 "num_blocks": 253952, 00:16:59.364 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:59.364 "assigned_rate_limits": { 00:16:59.364 "rw_ios_per_sec": 0, 00:16:59.364 "rw_mbytes_per_sec": 0, 00:16:59.364 "r_mbytes_per_sec": 0, 00:16:59.364 "w_mbytes_per_sec": 0 00:16:59.364 }, 00:16:59.364 "claimed": false, 00:16:59.364 "zoned": false, 00:16:59.364 "supported_io_types": { 00:16:59.364 "read": true, 00:16:59.364 "write": true, 00:16:59.364 "unmap": true, 00:16:59.364 "write_zeroes": true, 00:16:59.364 "flush": true, 00:16:59.364 "reset": true, 00:16:59.364 "compare": false, 00:16:59.364 "compare_and_write": false, 00:16:59.364 "abort": false, 00:16:59.364 "nvme_admin": false, 00:16:59.364 "nvme_io": false 00:16:59.364 }, 00:16:59.364 "memory_domains": [ 00:16:59.364 { 00:16:59.364 
"dma_device_id": "system", 00:16:59.364 "dma_device_type": 1 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.364 "dma_device_type": 2 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "system", 00:16:59.364 "dma_device_type": 1 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.364 "dma_device_type": 2 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "system", 00:16:59.364 "dma_device_type": 1 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.364 "dma_device_type": 2 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "system", 00:16:59.364 "dma_device_type": 1 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.364 "dma_device_type": 2 00:16:59.364 } 00:16:59.364 ], 00:16:59.364 "driver_specific": { 00:16:59.364 "raid": { 00:16:59.364 "uuid": "6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3", 00:16:59.364 "strip_size_kb": 64, 00:16:59.364 "state": "online", 00:16:59.364 "raid_level": "raid0", 00:16:59.364 "superblock": true, 00:16:59.364 "num_base_bdevs": 4, 00:16:59.364 "num_base_bdevs_discovered": 4, 00:16:59.364 "num_base_bdevs_operational": 4, 00:16:59.364 "base_bdevs_list": [ 00:16:59.364 { 00:16:59.364 "name": "pt1", 00:16:59.364 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:59.364 "is_configured": true, 00:16:59.364 "data_offset": 2048, 00:16:59.364 "data_size": 63488 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "name": "pt2", 00:16:59.364 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:16:59.364 "is_configured": true, 00:16:59.364 "data_offset": 2048, 00:16:59.364 "data_size": 63488 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "name": "pt3", 00:16:59.364 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:16:59.364 "is_configured": true, 00:16:59.364 "data_offset": 2048, 00:16:59.364 "data_size": 63488 00:16:59.364 }, 00:16:59.364 { 00:16:59.364 "name": "pt4", 00:16:59.364 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:16:59.364 "is_configured": true, 00:16:59.364 "data_offset": 2048, 00:16:59.364 "data_size": 63488 00:16:59.364 } 00:16:59.364 ] 00:16:59.364 } 00:16:59.364 } 00:16:59.364 }' 00:16:59.364 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:59.622 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:59.622 pt2 00:16:59.622 pt3 00:16:59.622 pt4' 00:16:59.622 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:59.622 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:59.622 23:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:59.880 "name": "pt1", 00:16:59.880 "aliases": [ 00:16:59.880 "1806c7e0-8814-5e58-9e6e-22580da78ed3" 00:16:59.880 ], 00:16:59.880 "product_name": "passthru", 00:16:59.880 "block_size": 512, 00:16:59.880 "num_blocks": 65536, 00:16:59.880 "uuid": "1806c7e0-8814-5e58-9e6e-22580da78ed3", 00:16:59.880 "assigned_rate_limits": { 00:16:59.880 "rw_ios_per_sec": 0, 00:16:59.880 "rw_mbytes_per_sec": 0, 00:16:59.880 "r_mbytes_per_sec": 0, 00:16:59.880 "w_mbytes_per_sec": 0 
00:16:59.880 }, 00:16:59.880 "claimed": true, 00:16:59.880 "claim_type": "exclusive_write", 00:16:59.880 "zoned": false, 00:16:59.880 "supported_io_types": { 00:16:59.880 "read": true, 00:16:59.880 "write": true, 00:16:59.880 "unmap": true, 00:16:59.880 "write_zeroes": true, 00:16:59.880 "flush": true, 00:16:59.880 "reset": true, 00:16:59.880 "compare": false, 00:16:59.880 "compare_and_write": false, 00:16:59.880 "abort": true, 00:16:59.880 "nvme_admin": false, 00:16:59.880 "nvme_io": false 00:16:59.880 }, 00:16:59.880 "memory_domains": [ 00:16:59.880 { 00:16:59.880 "dma_device_id": "system", 00:16:59.880 "dma_device_type": 1 00:16:59.880 }, 00:16:59.880 { 00:16:59.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.880 "dma_device_type": 2 00:16:59.880 } 00:16:59.880 ], 00:16:59.880 "driver_specific": { 00:16:59.880 "passthru": { 00:16:59.880 "name": "pt1", 00:16:59.880 "base_bdev_name": "malloc1" 00:16:59.880 } 00:16:59.880 } 00:16:59.880 }' 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.880 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:00.138 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:00.396 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:00.396 "name": "pt2", 00:17:00.396 "aliases": [ 00:17:00.396 "571fb3b0-056f-5723-8aac-a20b68df19ad" 00:17:00.396 ], 00:17:00.396 "product_name": "passthru", 00:17:00.396 "block_size": 512, 00:17:00.396 "num_blocks": 65536, 00:17:00.396 "uuid": "571fb3b0-056f-5723-8aac-a20b68df19ad", 00:17:00.396 "assigned_rate_limits": { 00:17:00.396 "rw_ios_per_sec": 0, 00:17:00.396 "rw_mbytes_per_sec": 0, 00:17:00.396 "r_mbytes_per_sec": 0, 00:17:00.396 "w_mbytes_per_sec": 0 00:17:00.396 }, 00:17:00.396 "claimed": true, 00:17:00.396 "claim_type": "exclusive_write", 00:17:00.396 "zoned": false, 00:17:00.396 "supported_io_types": { 00:17:00.396 "read": true, 00:17:00.396 "write": true, 00:17:00.396 "unmap": true, 00:17:00.396 "write_zeroes": true, 00:17:00.396 "flush": true, 00:17:00.396 "reset": true, 00:17:00.396 "compare": false, 00:17:00.396 "compare_and_write": false, 00:17:00.396 "abort": true, 
00:17:00.396 "nvme_admin": false, 00:17:00.396 "nvme_io": false 00:17:00.396 }, 00:17:00.396 "memory_domains": [ 00:17:00.396 { 00:17:00.396 "dma_device_id": "system", 00:17:00.396 "dma_device_type": 1 00:17:00.396 }, 00:17:00.396 { 00:17:00.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.396 "dma_device_type": 2 00:17:00.396 } 00:17:00.396 ], 00:17:00.396 "driver_specific": { 00:17:00.396 "passthru": { 00:17:00.396 "name": "pt2", 00:17:00.396 "base_bdev_name": "malloc2" 00:17:00.396 } 00:17:00.396 } 00:17:00.396 }' 00:17:00.396 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.396 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.396 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:00.396 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.653 23:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:00.653 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:00.910 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:00.910 "name": "pt3", 00:17:00.910 "aliases": [ 00:17:00.910 "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e" 00:17:00.910 ], 00:17:00.910 "product_name": "passthru", 00:17:00.910 "block_size": 512, 00:17:00.910 "num_blocks": 65536, 00:17:00.910 "uuid": "7bcd23f1-2fa6-5ba2-9b6a-0018c1c2ea0e", 00:17:00.910 "assigned_rate_limits": { 00:17:00.910 "rw_ios_per_sec": 0, 00:17:00.910 "rw_mbytes_per_sec": 0, 00:17:00.910 "r_mbytes_per_sec": 0, 00:17:00.910 "w_mbytes_per_sec": 0 00:17:00.910 }, 00:17:00.910 "claimed": true, 00:17:00.910 "claim_type": "exclusive_write", 00:17:00.910 "zoned": false, 00:17:00.910 "supported_io_types": { 00:17:00.910 "read": true, 00:17:00.910 "write": true, 00:17:00.910 "unmap": true, 00:17:00.910 "write_zeroes": true, 00:17:00.910 "flush": true, 00:17:00.910 "reset": true, 00:17:00.910 "compare": false, 00:17:00.910 "compare_and_write": false, 00:17:00.910 "abort": true, 00:17:00.910 "nvme_admin": false, 00:17:00.910 "nvme_io": false 00:17:00.910 }, 00:17:00.910 "memory_domains": [ 00:17:00.910 { 00:17:00.910 "dma_device_id": "system", 00:17:00.910 "dma_device_type": 1 00:17:00.910 }, 00:17:00.910 { 00:17:00.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.910 "dma_device_type": 2 00:17:00.910 } 00:17:00.910 ], 00:17:00.910 "driver_specific": { 00:17:00.910 "passthru": { 00:17:00.910 "name": 
"pt3", 00:17:00.910 "base_bdev_name": "malloc3" 00:17:00.910 } 00:17:00.910 } 00:17:00.910 }' 00:17:00.910 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.168 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.426 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.426 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:01.426 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:01.426 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:01.426 23:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:01.685 "name": "pt4", 00:17:01.685 "aliases": [ 00:17:01.685 "ebed89c0-129d-54df-b617-3e5b7b77acb0" 00:17:01.685 ], 00:17:01.685 "product_name": "passthru", 00:17:01.685 "block_size": 512, 00:17:01.685 "num_blocks": 65536, 00:17:01.685 "uuid": "ebed89c0-129d-54df-b617-3e5b7b77acb0", 00:17:01.685 "assigned_rate_limits": { 00:17:01.685 "rw_ios_per_sec": 0, 00:17:01.685 "rw_mbytes_per_sec": 0, 00:17:01.685 "r_mbytes_per_sec": 0, 00:17:01.685 "w_mbytes_per_sec": 0 00:17:01.685 }, 00:17:01.685 "claimed": true, 00:17:01.685 "claim_type": "exclusive_write", 00:17:01.685 "zoned": false, 00:17:01.685 "supported_io_types": { 00:17:01.685 "read": true, 00:17:01.685 "write": true, 00:17:01.685 "unmap": true, 00:17:01.685 "write_zeroes": true, 00:17:01.685 "flush": true, 00:17:01.685 "reset": true, 00:17:01.685 "compare": false, 00:17:01.685 "compare_and_write": false, 00:17:01.685 "abort": true, 00:17:01.685 "nvme_admin": false, 00:17:01.685 "nvme_io": false 00:17:01.685 }, 00:17:01.685 "memory_domains": [ 00:17:01.685 { 00:17:01.685 "dma_device_id": "system", 00:17:01.685 "dma_device_type": 1 00:17:01.685 }, 00:17:01.685 { 00:17:01.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.685 "dma_device_type": 2 00:17:01.685 } 00:17:01.685 ], 00:17:01.685 "driver_specific": { 00:17:01.685 "passthru": { 00:17:01.685 "name": "pt4", 00:17:01.685 "base_bdev_name": "malloc4" 00:17:01.685 } 00:17:01.685 } 00:17:01.685 }' 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:01.685 23:59:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.685 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.944 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:17:02.229 [2024-05-14 23:59:02.640144] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3 '!=' 6719f32a-a1bc-43dd-8c8c-d2b7f5d27bd3 ']' 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 443895 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 443895 ']' 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 443895 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 443895 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 443895' 00:17:02.229 killing process with pid 443895 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 443895 00:17:02.229 [2024-05-14 23:59:02.712713] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:02.229 [2024-05-14 23:59:02.712784] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:02.229 [2024-05-14 23:59:02.712852] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:02.229 [2024-05-14 23:59:02.712864] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d7a20 name raid_bdev1, state offline 00:17:02.229 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 443895 00:17:02.229 
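For reference, the per-base-bdev verification traced above boils down to fetching each passthru bdev over the test's RPC socket and comparing a handful of jq-extracted fields. A minimal sketch of that pattern, assuming an SPDK app is still listening on /var/tmp/spdk-raid.sock and a passthru bdev named pt1 exists; the RPC calls and jq filters are the ones already shown in the trace, not new API:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Fetch the bdev description as JSON, the same call the trace issues for pt1..pt4.
  info=$($rpc -s $sock bdev_get_bdevs -b pt1 | jq '.[]')

  # Assert the fields the test checks: block_size matches the 512-byte malloc
  # base bdev, and md_size/md_interleave/dif_type are all unset (null).
  [[ $(echo "$info" | jq .block_size) == 512 ]]
  [[ $(echo "$info" | jq .md_size) == null ]]
  [[ $(echo "$info" | jq .md_interleave) == null ]]
  [[ $(echo "$info" | jq .dif_type) == null ]]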
[2024-05-14 23:59:02.749287] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:02.488 23:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:17:02.488 00:17:02.488 real 0m16.098s 00:17:02.488 user 0m29.047s 00:17:02.488 sys 0m2.889s 00:17:02.488 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:02.488 23:59:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.488 ************************************ 00:17:02.488 END TEST raid_superblock_test 00:17:02.488 ************************************ 00:17:02.488 23:59:03 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:17:02.488 23:59:03 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:17:02.488 23:59:03 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:17:02.488 23:59:03 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:02.488 23:59:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:02.488 ************************************ 00:17:02.488 START TEST raid_state_function_test 00:17:02.488 ************************************ 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 false 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:17:02.488 23:59:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=446327 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 446327' 00:17:02.488 Process raid pid: 446327 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 446327 /var/tmp/spdk-raid.sock 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 446327 ']' 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:02.488 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:02.488 23:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.747 [2024-05-14 23:59:03.132904] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:17:02.747 [2024-05-14 23:59:03.132974] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:02.747 [2024-05-14 23:59:03.263502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:03.007 [2024-05-14 23:59:03.366605] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:03.007 [2024-05-14 23:59:03.428480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:03.007 [2024-05-14 23:59:03.428521] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:03.572 23:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:03.572 23:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:17:03.572 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:03.830 [2024-05-14 23:59:04.291763] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:03.830 [2024-05-14 23:59:04.291808] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:03.830 [2024-05-14 23:59:04.291823] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:03.830 [2024-05-14 23:59:04.291836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:03.830 [2024-05-14 23:59:04.291845] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:03.830 [2024-05-14 23:59:04.291856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:03.830 [2024-05-14 23:59:04.291865] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:03.830 [2024-05-14 23:59:04.291876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.830 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.088 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:04.088 "name": "Existed_Raid", 00:17:04.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.088 "strip_size_kb": 64, 00:17:04.088 "state": "configuring", 00:17:04.088 "raid_level": "concat", 00:17:04.088 "superblock": false, 00:17:04.088 "num_base_bdevs": 4, 00:17:04.088 "num_base_bdevs_discovered": 0, 00:17:04.088 "num_base_bdevs_operational": 4, 00:17:04.088 "base_bdevs_list": [ 00:17:04.088 { 00:17:04.088 "name": "BaseBdev1", 00:17:04.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.088 "is_configured": false, 00:17:04.088 "data_offset": 0, 00:17:04.088 "data_size": 0 00:17:04.088 }, 00:17:04.088 { 00:17:04.088 "name": "BaseBdev2", 00:17:04.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.088 "is_configured": false, 00:17:04.088 "data_offset": 0, 00:17:04.088 "data_size": 0 00:17:04.088 }, 00:17:04.088 { 00:17:04.088 "name": "BaseBdev3", 00:17:04.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.088 "is_configured": false, 00:17:04.088 "data_offset": 0, 00:17:04.088 "data_size": 0 00:17:04.088 }, 00:17:04.088 { 00:17:04.088 "name": "BaseBdev4", 00:17:04.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.088 "is_configured": false, 00:17:04.088 "data_offset": 0, 00:17:04.088 "data_size": 0 00:17:04.088 } 00:17:04.088 ] 00:17:04.088 }' 00:17:04.088 23:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:04.088 23:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.654 23:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:04.912 [2024-05-14 23:59:05.358632] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:04.912 [2024-05-14 23:59:05.358667] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x267cc00 name Existed_Raid, state configuring 00:17:04.912 23:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:05.170 [2024-05-14 23:59:05.567206] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:05.170 [2024-05-14 23:59:05.567246] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:05.170 [2024-05-14 23:59:05.567257] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:05.170 [2024-05-14 23:59:05.567268] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:05.170 [2024-05-14 23:59:05.567277] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:05.170 [2024-05-14 23:59:05.567289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:05.170 [2024-05-14 23:59:05.567298] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:05.170 [2024-05-14 23:59:05.567310] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:05.170 23:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:05.428 [2024-05-14 23:59:05.825718] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:05.428 BaseBdev1 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:05.428 23:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:05.686 23:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:05.944 [ 00:17:05.944 { 00:17:05.944 "name": "BaseBdev1", 00:17:05.944 "aliases": [ 00:17:05.944 "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1" 00:17:05.944 ], 00:17:05.944 "product_name": "Malloc disk", 00:17:05.944 "block_size": 512, 00:17:05.944 "num_blocks": 65536, 00:17:05.944 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:05.944 "assigned_rate_limits": { 00:17:05.944 "rw_ios_per_sec": 0, 00:17:05.944 "rw_mbytes_per_sec": 0, 00:17:05.944 "r_mbytes_per_sec": 0, 00:17:05.944 "w_mbytes_per_sec": 0 00:17:05.944 }, 00:17:05.944 "claimed": true, 00:17:05.944 "claim_type": "exclusive_write", 00:17:05.944 "zoned": false, 00:17:05.944 "supported_io_types": { 00:17:05.944 "read": true, 00:17:05.944 "write": true, 00:17:05.944 "unmap": true, 00:17:05.944 "write_zeroes": true, 00:17:05.944 "flush": true, 00:17:05.944 "reset": true, 00:17:05.944 "compare": false, 00:17:05.944 "compare_and_write": false, 00:17:05.944 "abort": true, 00:17:05.944 "nvme_admin": false, 00:17:05.944 "nvme_io": false 00:17:05.944 }, 00:17:05.944 "memory_domains": [ 00:17:05.944 { 00:17:05.944 "dma_device_id": "system", 00:17:05.944 "dma_device_type": 1 00:17:05.944 }, 00:17:05.944 { 00:17:05.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.944 "dma_device_type": 2 00:17:05.944 } 00:17:05.944 ], 00:17:05.944 "driver_specific": {} 00:17:05.944 } 00:17:05.944 ] 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local strip_size=64 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.944 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.204 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:06.204 "name": "Existed_Raid", 00:17:06.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.204 "strip_size_kb": 64, 00:17:06.204 "state": "configuring", 00:17:06.204 "raid_level": "concat", 00:17:06.204 "superblock": false, 00:17:06.204 "num_base_bdevs": 4, 00:17:06.204 "num_base_bdevs_discovered": 1, 00:17:06.204 "num_base_bdevs_operational": 4, 00:17:06.204 "base_bdevs_list": [ 00:17:06.204 { 00:17:06.204 "name": "BaseBdev1", 00:17:06.204 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:06.204 "is_configured": true, 00:17:06.204 "data_offset": 0, 00:17:06.204 "data_size": 65536 00:17:06.204 }, 00:17:06.204 { 00:17:06.204 "name": "BaseBdev2", 00:17:06.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.204 "is_configured": false, 00:17:06.204 "data_offset": 0, 00:17:06.204 "data_size": 0 00:17:06.204 }, 00:17:06.204 { 00:17:06.204 "name": "BaseBdev3", 00:17:06.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.204 "is_configured": false, 00:17:06.204 "data_offset": 0, 00:17:06.204 "data_size": 0 00:17:06.204 }, 00:17:06.204 { 00:17:06.204 "name": "BaseBdev4", 00:17:06.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.204 "is_configured": false, 00:17:06.204 "data_offset": 0, 00:17:06.204 "data_size": 0 00:17:06.204 } 00:17:06.204 ] 00:17:06.204 }' 00:17:06.204 23:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:06.204 23:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.773 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:07.032 [2024-05-14 23:59:07.385829] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:07.032 [2024-05-14 23:59:07.385872] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x267cea0 name Existed_Raid, state configuring 00:17:07.032 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:07.291 [2024-05-14 23:59:07.630516] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:07.291 [2024-05-14 23:59:07.631997] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:07.291 
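The sequence above is the heart of the state-function check: a concat array created before its base bdevs exist is registered in the "configuring" state, and each bdev_malloc_create lets the raid claim one more slot until the array can go online. A condensed sketch of that flow, reusing only the RPCs already shown in this log (the 32 MB / 512-byte malloc geometry and the BaseBdev1..4 names come straight from the trace):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Create the raid first; none of the base bdevs exist yet, so the bdev is
  # registered but stays in the "configuring" state.
  $rpc -s $sock bdev_raid_create -z 64 -r concat \
      -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

  # Add the base bdevs one at a time; each new malloc bdev is claimed by the
  # raid, and the state flips to "online" only after the fourth is claimed.
  for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
      $rpc -s $sock bdev_malloc_create 32 512 -b $b
      $rpc -s $sock bdev_raid_get_bdevs all | \
          jq -r '.[] | select(.name == "Existed_Raid") | .state'
  done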
[2024-05-14 23:59:07.632031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:07.291 [2024-05-14 23:59:07.632042] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:07.291 [2024-05-14 23:59:07.632054] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:07.291 [2024-05-14 23:59:07.632063] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:07.291 [2024-05-14 23:59:07.632075] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:07.291 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:07.292 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.292 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.551 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:07.551 "name": "Existed_Raid", 00:17:07.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.551 "strip_size_kb": 64, 00:17:07.551 "state": "configuring", 00:17:07.551 "raid_level": "concat", 00:17:07.551 "superblock": false, 00:17:07.551 "num_base_bdevs": 4, 00:17:07.551 "num_base_bdevs_discovered": 1, 00:17:07.551 "num_base_bdevs_operational": 4, 00:17:07.551 "base_bdevs_list": [ 00:17:07.551 { 00:17:07.551 "name": "BaseBdev1", 00:17:07.551 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:07.551 "is_configured": true, 00:17:07.551 "data_offset": 0, 00:17:07.551 "data_size": 65536 00:17:07.551 }, 00:17:07.551 { 00:17:07.551 "name": "BaseBdev2", 00:17:07.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.551 "is_configured": false, 00:17:07.551 "data_offset": 0, 00:17:07.551 "data_size": 0 00:17:07.551 }, 00:17:07.551 { 00:17:07.551 "name": "BaseBdev3", 00:17:07.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.551 "is_configured": false, 00:17:07.551 "data_offset": 0, 00:17:07.551 "data_size": 0 00:17:07.551 }, 00:17:07.551 { 00:17:07.551 
"name": "BaseBdev4", 00:17:07.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.551 "is_configured": false, 00:17:07.551 "data_offset": 0, 00:17:07.551 "data_size": 0 00:17:07.551 } 00:17:07.551 ] 00:17:07.551 }' 00:17:07.551 23:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:07.551 23:59:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.119 23:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:08.119 [2024-05-14 23:59:08.708907] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:08.119 BaseBdev2 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:08.378 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:08.637 23:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:08.637 [ 00:17:08.637 { 00:17:08.637 "name": "BaseBdev2", 00:17:08.637 "aliases": [ 00:17:08.637 "a55f3bc0-3713-453e-9112-d1da3ead94a8" 00:17:08.637 ], 00:17:08.637 "product_name": "Malloc disk", 00:17:08.637 "block_size": 512, 00:17:08.637 "num_blocks": 65536, 00:17:08.637 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:08.637 "assigned_rate_limits": { 00:17:08.637 "rw_ios_per_sec": 0, 00:17:08.637 "rw_mbytes_per_sec": 0, 00:17:08.637 "r_mbytes_per_sec": 0, 00:17:08.637 "w_mbytes_per_sec": 0 00:17:08.637 }, 00:17:08.637 "claimed": true, 00:17:08.637 "claim_type": "exclusive_write", 00:17:08.637 "zoned": false, 00:17:08.637 "supported_io_types": { 00:17:08.637 "read": true, 00:17:08.637 "write": true, 00:17:08.637 "unmap": true, 00:17:08.637 "write_zeroes": true, 00:17:08.637 "flush": true, 00:17:08.637 "reset": true, 00:17:08.637 "compare": false, 00:17:08.637 "compare_and_write": false, 00:17:08.637 "abort": true, 00:17:08.637 "nvme_admin": false, 00:17:08.637 "nvme_io": false 00:17:08.637 }, 00:17:08.637 "memory_domains": [ 00:17:08.637 { 00:17:08.637 "dma_device_id": "system", 00:17:08.637 "dma_device_type": 1 00:17:08.637 }, 00:17:08.637 { 00:17:08.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.637 "dma_device_type": 2 00:17:08.637 } 00:17:08.637 ], 00:17:08.637 "driver_specific": {} 00:17:08.637 } 00:17:08.637 ] 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < 
num_base_bdevs )) 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.637 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.897 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:08.897 "name": "Existed_Raid", 00:17:08.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.897 "strip_size_kb": 64, 00:17:08.897 "state": "configuring", 00:17:08.897 "raid_level": "concat", 00:17:08.897 "superblock": false, 00:17:08.897 "num_base_bdevs": 4, 00:17:08.897 "num_base_bdevs_discovered": 2, 00:17:08.897 "num_base_bdevs_operational": 4, 00:17:08.897 "base_bdevs_list": [ 00:17:08.897 { 00:17:08.897 "name": "BaseBdev1", 00:17:08.897 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:08.897 "is_configured": true, 00:17:08.897 "data_offset": 0, 00:17:08.897 "data_size": 65536 00:17:08.897 }, 00:17:08.897 { 00:17:08.897 "name": "BaseBdev2", 00:17:08.897 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:08.897 "is_configured": true, 00:17:08.897 "data_offset": 0, 00:17:08.897 "data_size": 65536 00:17:08.897 }, 00:17:08.897 { 00:17:08.897 "name": "BaseBdev3", 00:17:08.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.897 "is_configured": false, 00:17:08.897 "data_offset": 0, 00:17:08.897 "data_size": 0 00:17:08.897 }, 00:17:08.897 { 00:17:08.897 "name": "BaseBdev4", 00:17:08.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.897 "is_configured": false, 00:17:08.897 "data_offset": 0, 00:17:08.897 "data_size": 0 00:17:08.897 } 00:17:08.897 ] 00:17:08.897 }' 00:17:08.897 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:08.897 23:59:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.465 23:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:09.723 [2024-05-14 23:59:10.220323] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:09.723 BaseBdev3 00:17:09.723 23:59:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:17:09.723 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:09.723 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:09.724 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:09.724 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:09.724 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:09.724 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.981 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:10.240 [ 00:17:10.240 { 00:17:10.240 "name": "BaseBdev3", 00:17:10.240 "aliases": [ 00:17:10.240 "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30" 00:17:10.240 ], 00:17:10.240 "product_name": "Malloc disk", 00:17:10.240 "block_size": 512, 00:17:10.240 "num_blocks": 65536, 00:17:10.240 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:10.240 "assigned_rate_limits": { 00:17:10.240 "rw_ios_per_sec": 0, 00:17:10.240 "rw_mbytes_per_sec": 0, 00:17:10.240 "r_mbytes_per_sec": 0, 00:17:10.240 "w_mbytes_per_sec": 0 00:17:10.240 }, 00:17:10.240 "claimed": true, 00:17:10.240 "claim_type": "exclusive_write", 00:17:10.240 "zoned": false, 00:17:10.240 "supported_io_types": { 00:17:10.240 "read": true, 00:17:10.240 "write": true, 00:17:10.240 "unmap": true, 00:17:10.240 "write_zeroes": true, 00:17:10.240 "flush": true, 00:17:10.240 "reset": true, 00:17:10.240 "compare": false, 00:17:10.240 "compare_and_write": false, 00:17:10.240 "abort": true, 00:17:10.240 "nvme_admin": false, 00:17:10.240 "nvme_io": false 00:17:10.240 }, 00:17:10.240 "memory_domains": [ 00:17:10.240 { 00:17:10.240 "dma_device_id": "system", 00:17:10.240 "dma_device_type": 1 00:17:10.240 }, 00:17:10.240 { 00:17:10.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.240 "dma_device_type": 2 00:17:10.240 } 00:17:10.240 ], 00:17:10.240 "driver_specific": {} 00:17:10.240 } 00:17:10.240 ] 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.240 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.498 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:10.498 "name": "Existed_Raid", 00:17:10.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.498 "strip_size_kb": 64, 00:17:10.498 "state": "configuring", 00:17:10.498 "raid_level": "concat", 00:17:10.498 "superblock": false, 00:17:10.498 "num_base_bdevs": 4, 00:17:10.498 "num_base_bdevs_discovered": 3, 00:17:10.498 "num_base_bdevs_operational": 4, 00:17:10.498 "base_bdevs_list": [ 00:17:10.498 { 00:17:10.498 "name": "BaseBdev1", 00:17:10.498 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:10.498 "is_configured": true, 00:17:10.498 "data_offset": 0, 00:17:10.498 "data_size": 65536 00:17:10.498 }, 00:17:10.498 { 00:17:10.498 "name": "BaseBdev2", 00:17:10.498 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:10.498 "is_configured": true, 00:17:10.498 "data_offset": 0, 00:17:10.498 "data_size": 65536 00:17:10.498 }, 00:17:10.498 { 00:17:10.498 "name": "BaseBdev3", 00:17:10.498 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:10.498 "is_configured": true, 00:17:10.498 "data_offset": 0, 00:17:10.498 "data_size": 65536 00:17:10.498 }, 00:17:10.498 { 00:17:10.498 "name": "BaseBdev4", 00:17:10.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.498 "is_configured": false, 00:17:10.498 "data_offset": 0, 00:17:10.498 "data_size": 0 00:17:10.498 } 00:17:10.498 ] 00:17:10.498 }' 00:17:10.498 23:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:10.498 23:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.064 23:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:11.323 [2024-05-14 23:59:11.780166] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:11.323 [2024-05-14 23:59:11.780206] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x267c470 00:17:11.323 [2024-05-14 23:59:11.780214] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:11.323 [2024-05-14 23:59:11.780424] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267cb40 00:17:11.323 [2024-05-14 23:59:11.780552] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x267c470 00:17:11.323 [2024-05-14 23:59:11.780562] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x267c470 00:17:11.323 [2024-05-14 23:59:11.780727] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:11.323 BaseBdev4 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev4 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:11.323 23:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.582 23:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:11.841 [ 00:17:11.841 { 00:17:11.841 "name": "BaseBdev4", 00:17:11.841 "aliases": [ 00:17:11.841 "ec820b1b-3075-4be2-88f3-f90bbe663bb5" 00:17:11.841 ], 00:17:11.841 "product_name": "Malloc disk", 00:17:11.841 "block_size": 512, 00:17:11.841 "num_blocks": 65536, 00:17:11.841 "uuid": "ec820b1b-3075-4be2-88f3-f90bbe663bb5", 00:17:11.841 "assigned_rate_limits": { 00:17:11.841 "rw_ios_per_sec": 0, 00:17:11.841 "rw_mbytes_per_sec": 0, 00:17:11.841 "r_mbytes_per_sec": 0, 00:17:11.841 "w_mbytes_per_sec": 0 00:17:11.841 }, 00:17:11.841 "claimed": true, 00:17:11.841 "claim_type": "exclusive_write", 00:17:11.841 "zoned": false, 00:17:11.841 "supported_io_types": { 00:17:11.841 "read": true, 00:17:11.841 "write": true, 00:17:11.841 "unmap": true, 00:17:11.841 "write_zeroes": true, 00:17:11.841 "flush": true, 00:17:11.841 "reset": true, 00:17:11.841 "compare": false, 00:17:11.841 "compare_and_write": false, 00:17:11.841 "abort": true, 00:17:11.841 "nvme_admin": false, 00:17:11.841 "nvme_io": false 00:17:11.841 }, 00:17:11.841 "memory_domains": [ 00:17:11.841 { 00:17:11.841 "dma_device_id": "system", 00:17:11.841 "dma_device_type": 1 00:17:11.841 }, 00:17:11.841 { 00:17:11.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.841 "dma_device_type": 2 00:17:11.841 } 00:17:11.841 ], 00:17:11.841 "driver_specific": {} 00:17:11.841 } 00:17:11.841 ] 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:11.841 23:59:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.841 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.100 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:12.100 "name": "Existed_Raid", 00:17:12.100 "uuid": "33cc5bcd-76b3-4e8e-9dcf-3428816baa77", 00:17:12.100 "strip_size_kb": 64, 00:17:12.100 "state": "online", 00:17:12.100 "raid_level": "concat", 00:17:12.100 "superblock": false, 00:17:12.100 "num_base_bdevs": 4, 00:17:12.100 "num_base_bdevs_discovered": 4, 00:17:12.100 "num_base_bdevs_operational": 4, 00:17:12.100 "base_bdevs_list": [ 00:17:12.100 { 00:17:12.100 "name": "BaseBdev1", 00:17:12.100 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:12.100 "is_configured": true, 00:17:12.100 "data_offset": 0, 00:17:12.100 "data_size": 65536 00:17:12.100 }, 00:17:12.100 { 00:17:12.100 "name": "BaseBdev2", 00:17:12.100 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:12.100 "is_configured": true, 00:17:12.100 "data_offset": 0, 00:17:12.100 "data_size": 65536 00:17:12.100 }, 00:17:12.100 { 00:17:12.100 "name": "BaseBdev3", 00:17:12.101 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:12.101 "is_configured": true, 00:17:12.101 "data_offset": 0, 00:17:12.101 "data_size": 65536 00:17:12.101 }, 00:17:12.101 { 00:17:12.101 "name": "BaseBdev4", 00:17:12.101 "uuid": "ec820b1b-3075-4be2-88f3-f90bbe663bb5", 00:17:12.101 "is_configured": true, 00:17:12.101 "data_offset": 0, 00:17:12.101 "data_size": 65536 00:17:12.101 } 00:17:12.101 ] 00:17:12.101 }' 00:17:12.101 23:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:12.101 23:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:12.668 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:12.926 [2024-05-14 23:59:13.292489] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.926 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:12.926 "name": "Existed_Raid", 00:17:12.926 "aliases": [ 00:17:12.926 
"33cc5bcd-76b3-4e8e-9dcf-3428816baa77" 00:17:12.926 ], 00:17:12.926 "product_name": "Raid Volume", 00:17:12.926 "block_size": 512, 00:17:12.926 "num_blocks": 262144, 00:17:12.926 "uuid": "33cc5bcd-76b3-4e8e-9dcf-3428816baa77", 00:17:12.926 "assigned_rate_limits": { 00:17:12.926 "rw_ios_per_sec": 0, 00:17:12.926 "rw_mbytes_per_sec": 0, 00:17:12.926 "r_mbytes_per_sec": 0, 00:17:12.926 "w_mbytes_per_sec": 0 00:17:12.926 }, 00:17:12.926 "claimed": false, 00:17:12.926 "zoned": false, 00:17:12.926 "supported_io_types": { 00:17:12.926 "read": true, 00:17:12.926 "write": true, 00:17:12.926 "unmap": true, 00:17:12.926 "write_zeroes": true, 00:17:12.926 "flush": true, 00:17:12.926 "reset": true, 00:17:12.926 "compare": false, 00:17:12.926 "compare_and_write": false, 00:17:12.926 "abort": false, 00:17:12.926 "nvme_admin": false, 00:17:12.926 "nvme_io": false 00:17:12.926 }, 00:17:12.926 "memory_domains": [ 00:17:12.926 { 00:17:12.926 "dma_device_id": "system", 00:17:12.926 "dma_device_type": 1 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.926 "dma_device_type": 2 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "system", 00:17:12.926 "dma_device_type": 1 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.926 "dma_device_type": 2 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "system", 00:17:12.926 "dma_device_type": 1 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.926 "dma_device_type": 2 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "system", 00:17:12.926 "dma_device_type": 1 00:17:12.926 }, 00:17:12.926 { 00:17:12.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.926 "dma_device_type": 2 00:17:12.926 } 00:17:12.926 ], 00:17:12.926 "driver_specific": { 00:17:12.926 "raid": { 00:17:12.926 "uuid": "33cc5bcd-76b3-4e8e-9dcf-3428816baa77", 00:17:12.926 "strip_size_kb": 64, 00:17:12.926 "state": "online", 00:17:12.926 "raid_level": "concat", 00:17:12.926 "superblock": false, 00:17:12.926 "num_base_bdevs": 4, 00:17:12.926 "num_base_bdevs_discovered": 4, 00:17:12.926 "num_base_bdevs_operational": 4, 00:17:12.926 "base_bdevs_list": [ 00:17:12.926 { 00:17:12.926 "name": "BaseBdev1", 00:17:12.926 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:12.926 "is_configured": true, 00:17:12.927 "data_offset": 0, 00:17:12.927 "data_size": 65536 00:17:12.927 }, 00:17:12.927 { 00:17:12.927 "name": "BaseBdev2", 00:17:12.927 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:12.927 "is_configured": true, 00:17:12.927 "data_offset": 0, 00:17:12.927 "data_size": 65536 00:17:12.927 }, 00:17:12.927 { 00:17:12.927 "name": "BaseBdev3", 00:17:12.927 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:12.927 "is_configured": true, 00:17:12.927 "data_offset": 0, 00:17:12.927 "data_size": 65536 00:17:12.927 }, 00:17:12.927 { 00:17:12.927 "name": "BaseBdev4", 00:17:12.927 "uuid": "ec820b1b-3075-4be2-88f3-f90bbe663bb5", 00:17:12.927 "is_configured": true, 00:17:12.927 "data_offset": 0, 00:17:12.927 "data_size": 65536 00:17:12.927 } 00:17:12.927 ] 00:17:12.927 } 00:17:12.927 } 00:17:12.927 }' 00:17:12.927 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:12.927 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:17:12.927 BaseBdev2 00:17:12.927 BaseBdev3 00:17:12.927 BaseBdev4' 
00:17:12.927 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:12.927 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:12.927 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:13.184 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:13.184 "name": "BaseBdev1", 00:17:13.184 "aliases": [ 00:17:13.184 "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1" 00:17:13.184 ], 00:17:13.184 "product_name": "Malloc disk", 00:17:13.184 "block_size": 512, 00:17:13.184 "num_blocks": 65536, 00:17:13.184 "uuid": "ba6a34e1-727f-4de3-a980-6bbe3e7a90a1", 00:17:13.184 "assigned_rate_limits": { 00:17:13.184 "rw_ios_per_sec": 0, 00:17:13.184 "rw_mbytes_per_sec": 0, 00:17:13.184 "r_mbytes_per_sec": 0, 00:17:13.184 "w_mbytes_per_sec": 0 00:17:13.184 }, 00:17:13.184 "claimed": true, 00:17:13.184 "claim_type": "exclusive_write", 00:17:13.184 "zoned": false, 00:17:13.184 "supported_io_types": { 00:17:13.184 "read": true, 00:17:13.184 "write": true, 00:17:13.184 "unmap": true, 00:17:13.184 "write_zeroes": true, 00:17:13.184 "flush": true, 00:17:13.184 "reset": true, 00:17:13.184 "compare": false, 00:17:13.184 "compare_and_write": false, 00:17:13.184 "abort": true, 00:17:13.184 "nvme_admin": false, 00:17:13.184 "nvme_io": false 00:17:13.184 }, 00:17:13.184 "memory_domains": [ 00:17:13.184 { 00:17:13.184 "dma_device_id": "system", 00:17:13.184 "dma_device_type": 1 00:17:13.184 }, 00:17:13.184 { 00:17:13.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.184 "dma_device_type": 2 00:17:13.184 } 00:17:13.184 ], 00:17:13.184 "driver_specific": {} 00:17:13.184 }' 00:17:13.184 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.185 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:13.443 23:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:13.702 "name": "BaseBdev2", 00:17:13.702 "aliases": [ 00:17:13.702 "a55f3bc0-3713-453e-9112-d1da3ead94a8" 00:17:13.702 ], 00:17:13.702 "product_name": "Malloc disk", 00:17:13.702 "block_size": 512, 00:17:13.702 "num_blocks": 65536, 00:17:13.702 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:13.702 "assigned_rate_limits": { 00:17:13.702 "rw_ios_per_sec": 0, 00:17:13.702 "rw_mbytes_per_sec": 0, 00:17:13.702 "r_mbytes_per_sec": 0, 00:17:13.702 "w_mbytes_per_sec": 0 00:17:13.702 }, 00:17:13.702 "claimed": true, 00:17:13.702 "claim_type": "exclusive_write", 00:17:13.702 "zoned": false, 00:17:13.702 "supported_io_types": { 00:17:13.702 "read": true, 00:17:13.702 "write": true, 00:17:13.702 "unmap": true, 00:17:13.702 "write_zeroes": true, 00:17:13.702 "flush": true, 00:17:13.702 "reset": true, 00:17:13.702 "compare": false, 00:17:13.702 "compare_and_write": false, 00:17:13.702 "abort": true, 00:17:13.702 "nvme_admin": false, 00:17:13.702 "nvme_io": false 00:17:13.702 }, 00:17:13.702 "memory_domains": [ 00:17:13.702 { 00:17:13.702 "dma_device_id": "system", 00:17:13.702 "dma_device_type": 1 00:17:13.702 }, 00:17:13.702 { 00:17:13.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.702 "dma_device_type": 2 00:17:13.702 } 00:17:13.702 ], 00:17:13.702 "driver_specific": {} 00:17:13.702 }' 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.702 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:13.961 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:14.220 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:14.220 "name": "BaseBdev3", 00:17:14.220 "aliases": [ 00:17:14.220 "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30" 00:17:14.220 ], 00:17:14.220 "product_name": "Malloc disk", 00:17:14.220 "block_size": 512, 00:17:14.220 "num_blocks": 65536, 00:17:14.220 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:14.220 "assigned_rate_limits": { 00:17:14.220 "rw_ios_per_sec": 0, 00:17:14.220 "rw_mbytes_per_sec": 0, 00:17:14.220 
"r_mbytes_per_sec": 0, 00:17:14.220 "w_mbytes_per_sec": 0 00:17:14.220 }, 00:17:14.220 "claimed": true, 00:17:14.220 "claim_type": "exclusive_write", 00:17:14.220 "zoned": false, 00:17:14.220 "supported_io_types": { 00:17:14.220 "read": true, 00:17:14.220 "write": true, 00:17:14.220 "unmap": true, 00:17:14.220 "write_zeroes": true, 00:17:14.220 "flush": true, 00:17:14.220 "reset": true, 00:17:14.220 "compare": false, 00:17:14.220 "compare_and_write": false, 00:17:14.220 "abort": true, 00:17:14.220 "nvme_admin": false, 00:17:14.220 "nvme_io": false 00:17:14.220 }, 00:17:14.220 "memory_domains": [ 00:17:14.220 { 00:17:14.220 "dma_device_id": "system", 00:17:14.220 "dma_device_type": 1 00:17:14.220 }, 00:17:14.220 { 00:17:14.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.220 "dma_device_type": 2 00:17:14.220 } 00:17:14.220 ], 00:17:14.220 "driver_specific": {} 00:17:14.220 }' 00:17:14.220 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.220 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.479 23:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.479 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:14.737 "name": "BaseBdev4", 00:17:14.737 "aliases": [ 00:17:14.737 "ec820b1b-3075-4be2-88f3-f90bbe663bb5" 00:17:14.737 ], 00:17:14.737 "product_name": "Malloc disk", 00:17:14.737 "block_size": 512, 00:17:14.737 "num_blocks": 65536, 00:17:14.737 "uuid": "ec820b1b-3075-4be2-88f3-f90bbe663bb5", 00:17:14.737 "assigned_rate_limits": { 00:17:14.737 "rw_ios_per_sec": 0, 00:17:14.737 "rw_mbytes_per_sec": 0, 00:17:14.737 "r_mbytes_per_sec": 0, 00:17:14.737 "w_mbytes_per_sec": 0 00:17:14.737 }, 00:17:14.737 "claimed": true, 00:17:14.737 "claim_type": "exclusive_write", 00:17:14.737 "zoned": false, 00:17:14.737 "supported_io_types": { 00:17:14.737 "read": true, 00:17:14.737 "write": true, 00:17:14.737 "unmap": true, 00:17:14.737 "write_zeroes": true, 00:17:14.737 "flush": true, 00:17:14.737 "reset": true, 00:17:14.737 "compare": false, 00:17:14.737 "compare_and_write": false, 00:17:14.737 
"abort": true, 00:17:14.737 "nvme_admin": false, 00:17:14.737 "nvme_io": false 00:17:14.737 }, 00:17:14.737 "memory_domains": [ 00:17:14.737 { 00:17:14.737 "dma_device_id": "system", 00:17:14.737 "dma_device_type": 1 00:17:14.737 }, 00:17:14.737 { 00:17:14.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.737 "dma_device_type": 2 00:17:14.737 } 00:17:14.737 ], 00:17:14.737 "driver_specific": {} 00:17:14.737 }' 00:17:14.737 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.996 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.254 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.254 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:15.254 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:15.514 [2024-05-14 23:59:15.871083] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:15.514 [2024-05-14 23:59:15.871112] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:15.514 [2024-05-14 23:59:15.871163] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
raid_bdev_info 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.514 23:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.773 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:15.773 "name": "Existed_Raid", 00:17:15.773 "uuid": "33cc5bcd-76b3-4e8e-9dcf-3428816baa77", 00:17:15.773 "strip_size_kb": 64, 00:17:15.773 "state": "offline", 00:17:15.773 "raid_level": "concat", 00:17:15.773 "superblock": false, 00:17:15.773 "num_base_bdevs": 4, 00:17:15.773 "num_base_bdevs_discovered": 3, 00:17:15.773 "num_base_bdevs_operational": 3, 00:17:15.773 "base_bdevs_list": [ 00:17:15.773 { 00:17:15.773 "name": null, 00:17:15.773 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.773 "is_configured": false, 00:17:15.773 "data_offset": 0, 00:17:15.773 "data_size": 65536 00:17:15.773 }, 00:17:15.773 { 00:17:15.773 "name": "BaseBdev2", 00:17:15.773 "uuid": "a55f3bc0-3713-453e-9112-d1da3ead94a8", 00:17:15.773 "is_configured": true, 00:17:15.773 "data_offset": 0, 00:17:15.773 "data_size": 65536 00:17:15.773 }, 00:17:15.773 { 00:17:15.773 "name": "BaseBdev3", 00:17:15.773 "uuid": "6222964c-3aa1-4b2c-abd2-39ae9ba8fc30", 00:17:15.773 "is_configured": true, 00:17:15.773 "data_offset": 0, 00:17:15.773 "data_size": 65536 00:17:15.773 }, 00:17:15.773 { 00:17:15.773 "name": "BaseBdev4", 00:17:15.773 "uuid": "ec820b1b-3075-4be2-88f3-f90bbe663bb5", 00:17:15.773 "is_configured": true, 00:17:15.773 "data_offset": 0, 00:17:15.773 "data_size": 65536 00:17:15.773 } 00:17:15.773 ] 00:17:15.773 }' 00:17:15.773 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:15.773 23:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.373 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:17:16.373 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:16.373 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.373 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:16.631 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:16.631 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:16.631 23:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:16.631 [2024-05-14 23:59:17.195652] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
(( i < num_base_bdevs )) 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:16.889 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:17.148 [2024-05-14 23:59:17.645284] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:17.148 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:17.148 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:17.148 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.148 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:17.406 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:17.406 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:17.406 23:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:17.664 [2024-05-14 23:59:18.153022] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:17.664 [2024-05-14 23:59:18.153072] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x267c470 name Existed_Raid, state offline 00:17:17.664 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:17.664 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:17.664 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.664 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:17.922 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:18.181 BaseBdev2 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:18.181 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:18.440 23:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:18.698 [ 00:17:18.698 { 00:17:18.698 "name": "BaseBdev2", 00:17:18.698 "aliases": [ 00:17:18.698 "5f99205a-57c4-4c46-b580-8999372308b7" 00:17:18.698 ], 00:17:18.698 "product_name": "Malloc disk", 00:17:18.698 "block_size": 512, 00:17:18.698 "num_blocks": 65536, 00:17:18.698 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:18.698 "assigned_rate_limits": { 00:17:18.698 "rw_ios_per_sec": 0, 00:17:18.698 "rw_mbytes_per_sec": 0, 00:17:18.698 "r_mbytes_per_sec": 0, 00:17:18.698 "w_mbytes_per_sec": 0 00:17:18.698 }, 00:17:18.698 "claimed": false, 00:17:18.698 "zoned": false, 00:17:18.698 "supported_io_types": { 00:17:18.698 "read": true, 00:17:18.698 "write": true, 00:17:18.698 "unmap": true, 00:17:18.698 "write_zeroes": true, 00:17:18.698 "flush": true, 00:17:18.698 "reset": true, 00:17:18.698 "compare": false, 00:17:18.698 "compare_and_write": false, 00:17:18.698 "abort": true, 00:17:18.698 "nvme_admin": false, 00:17:18.698 "nvme_io": false 00:17:18.698 }, 00:17:18.698 "memory_domains": [ 00:17:18.698 { 00:17:18.698 "dma_device_id": "system", 00:17:18.698 "dma_device_type": 1 00:17:18.698 }, 00:17:18.698 { 00:17:18.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.698 "dma_device_type": 2 00:17:18.698 } 00:17:18.698 ], 00:17:18.698 "driver_specific": {} 00:17:18.698 } 00:17:18.698 ] 00:17:18.698 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:18.698 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:18.698 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:18.698 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:18.957 BaseBdev3 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:18.957 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.215 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:19.472 [ 00:17:19.472 { 00:17:19.472 "name": "BaseBdev3", 00:17:19.472 "aliases": [ 00:17:19.472 "dd37aea9-5e92-4375-92da-887c373673a6" 00:17:19.472 ], 00:17:19.472 "product_name": "Malloc disk", 00:17:19.472 "block_size": 512, 00:17:19.472 "num_blocks": 65536, 00:17:19.472 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:19.472 "assigned_rate_limits": { 00:17:19.472 "rw_ios_per_sec": 0, 00:17:19.472 "rw_mbytes_per_sec": 0, 00:17:19.472 "r_mbytes_per_sec": 0, 00:17:19.472 "w_mbytes_per_sec": 0 00:17:19.472 }, 00:17:19.472 "claimed": false, 00:17:19.472 "zoned": false, 00:17:19.472 "supported_io_types": { 00:17:19.472 "read": true, 00:17:19.472 "write": true, 00:17:19.472 "unmap": true, 00:17:19.472 "write_zeroes": true, 00:17:19.472 "flush": true, 00:17:19.472 "reset": true, 00:17:19.472 "compare": false, 00:17:19.472 "compare_and_write": false, 00:17:19.472 "abort": true, 00:17:19.472 "nvme_admin": false, 00:17:19.472 "nvme_io": false 00:17:19.472 }, 00:17:19.472 "memory_domains": [ 00:17:19.472 { 00:17:19.472 "dma_device_id": "system", 00:17:19.472 "dma_device_type": 1 00:17:19.472 }, 00:17:19.472 { 00:17:19.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.472 "dma_device_type": 2 00:17:19.472 } 00:17:19.472 ], 00:17:19.472 "driver_specific": {} 00:17:19.472 } 00:17:19.472 ] 00:17:19.472 23:59:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:19.472 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:19.472 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:19.472 23:59:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:19.729 BaseBdev4 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:19.729 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.986 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:19.986 [ 00:17:19.986 { 00:17:19.986 "name": "BaseBdev4", 00:17:19.986 "aliases": [ 00:17:19.986 "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb" 00:17:19.986 ], 00:17:19.986 "product_name": "Malloc disk", 00:17:19.986 "block_size": 512, 00:17:19.986 
"num_blocks": 65536, 00:17:19.986 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:19.986 "assigned_rate_limits": { 00:17:19.986 "rw_ios_per_sec": 0, 00:17:19.986 "rw_mbytes_per_sec": 0, 00:17:19.986 "r_mbytes_per_sec": 0, 00:17:19.986 "w_mbytes_per_sec": 0 00:17:19.986 }, 00:17:19.986 "claimed": false, 00:17:19.986 "zoned": false, 00:17:19.986 "supported_io_types": { 00:17:19.986 "read": true, 00:17:19.986 "write": true, 00:17:19.986 "unmap": true, 00:17:19.986 "write_zeroes": true, 00:17:19.986 "flush": true, 00:17:19.986 "reset": true, 00:17:19.986 "compare": false, 00:17:19.986 "compare_and_write": false, 00:17:19.986 "abort": true, 00:17:19.986 "nvme_admin": false, 00:17:19.986 "nvme_io": false 00:17:19.986 }, 00:17:19.986 "memory_domains": [ 00:17:19.986 { 00:17:19.986 "dma_device_id": "system", 00:17:19.986 "dma_device_type": 1 00:17:19.986 }, 00:17:19.986 { 00:17:19.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.986 "dma_device_type": 2 00:17:19.986 } 00:17:19.987 ], 00:17:19.987 "driver_specific": {} 00:17:19.987 } 00:17:19.987 ] 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:20.244 [2024-05-14 23:59:20.799820] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:20.244 [2024-05-14 23:59:20.799866] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:20.244 [2024-05-14 23:59:20.799886] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:20.244 [2024-05-14 23:59:20.801266] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:20.244 [2024-05-14 23:59:20.801310] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:20.244 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.245 23:59:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.503 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:20.503 "name": "Existed_Raid", 00:17:20.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.503 "strip_size_kb": 64, 00:17:20.503 "state": "configuring", 00:17:20.503 "raid_level": "concat", 00:17:20.503 "superblock": false, 00:17:20.503 "num_base_bdevs": 4, 00:17:20.503 "num_base_bdevs_discovered": 3, 00:17:20.503 "num_base_bdevs_operational": 4, 00:17:20.503 "base_bdevs_list": [ 00:17:20.503 { 00:17:20.503 "name": "BaseBdev1", 00:17:20.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.503 "is_configured": false, 00:17:20.503 "data_offset": 0, 00:17:20.503 "data_size": 0 00:17:20.503 }, 00:17:20.503 { 00:17:20.503 "name": "BaseBdev2", 00:17:20.503 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:20.503 "is_configured": true, 00:17:20.503 "data_offset": 0, 00:17:20.503 "data_size": 65536 00:17:20.503 }, 00:17:20.503 { 00:17:20.503 "name": "BaseBdev3", 00:17:20.503 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:20.503 "is_configured": true, 00:17:20.503 "data_offset": 0, 00:17:20.503 "data_size": 65536 00:17:20.503 }, 00:17:20.503 { 00:17:20.503 "name": "BaseBdev4", 00:17:20.503 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:20.503 "is_configured": true, 00:17:20.503 "data_offset": 0, 00:17:20.503 "data_size": 65536 00:17:20.503 } 00:17:20.503 ] 00:17:20.503 }' 00:17:20.503 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:20.503 23:59:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.070 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:21.328 [2024-05-14 23:59:21.870805] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:21.328 23:59:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.586 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:21.586 "name": "Existed_Raid", 00:17:21.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.586 "strip_size_kb": 64, 00:17:21.586 "state": "configuring", 00:17:21.586 "raid_level": "concat", 00:17:21.586 "superblock": false, 00:17:21.586 "num_base_bdevs": 4, 00:17:21.586 "num_base_bdevs_discovered": 2, 00:17:21.586 "num_base_bdevs_operational": 4, 00:17:21.586 "base_bdevs_list": [ 00:17:21.586 { 00:17:21.586 "name": "BaseBdev1", 00:17:21.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.586 "is_configured": false, 00:17:21.586 "data_offset": 0, 00:17:21.586 "data_size": 0 00:17:21.586 }, 00:17:21.586 { 00:17:21.586 "name": null, 00:17:21.586 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:21.586 "is_configured": false, 00:17:21.586 "data_offset": 0, 00:17:21.586 "data_size": 65536 00:17:21.586 }, 00:17:21.586 { 00:17:21.586 "name": "BaseBdev3", 00:17:21.586 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:21.586 "is_configured": true, 00:17:21.586 "data_offset": 0, 00:17:21.586 "data_size": 65536 00:17:21.586 }, 00:17:21.586 { 00:17:21.586 "name": "BaseBdev4", 00:17:21.586 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:21.586 "is_configured": true, 00:17:21.586 "data_offset": 0, 00:17:21.586 "data_size": 65536 00:17:21.586 } 00:17:21.586 ] 00:17:21.586 }' 00:17:21.586 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:21.586 23:59:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.150 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.150 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:22.407 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:17:22.407 23:59:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:22.665 [2024-05-14 23:59:23.142915] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:22.665 BaseBdev1 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:22.665 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.923 23:59:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:23.182 [ 00:17:23.182 { 00:17:23.182 "name": "BaseBdev1", 00:17:23.182 "aliases": [ 00:17:23.182 "c3c681a8-dc34-443f-ab32-70aaf687804a" 00:17:23.182 ], 00:17:23.182 "product_name": "Malloc disk", 00:17:23.182 "block_size": 512, 00:17:23.182 "num_blocks": 65536, 00:17:23.182 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:23.182 "assigned_rate_limits": { 00:17:23.182 "rw_ios_per_sec": 0, 00:17:23.182 "rw_mbytes_per_sec": 0, 00:17:23.182 "r_mbytes_per_sec": 0, 00:17:23.182 "w_mbytes_per_sec": 0 00:17:23.182 }, 00:17:23.182 "claimed": true, 00:17:23.182 "claim_type": "exclusive_write", 00:17:23.182 "zoned": false, 00:17:23.182 "supported_io_types": { 00:17:23.182 "read": true, 00:17:23.182 "write": true, 00:17:23.182 "unmap": true, 00:17:23.182 "write_zeroes": true, 00:17:23.182 "flush": true, 00:17:23.182 "reset": true, 00:17:23.182 "compare": false, 00:17:23.182 "compare_and_write": false, 00:17:23.182 "abort": true, 00:17:23.182 "nvme_admin": false, 00:17:23.182 "nvme_io": false 00:17:23.182 }, 00:17:23.182 "memory_domains": [ 00:17:23.182 { 00:17:23.182 "dma_device_id": "system", 00:17:23.182 "dma_device_type": 1 00:17:23.182 }, 00:17:23.182 { 00:17:23.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.182 "dma_device_type": 2 00:17:23.182 } 00:17:23.182 ], 00:17:23.182 "driver_specific": {} 00:17:23.182 } 00:17:23.182 ] 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.182 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.440 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:23.440 "name": "Existed_Raid", 00:17:23.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.440 "strip_size_kb": 64, 00:17:23.440 "state": "configuring", 00:17:23.440 "raid_level": "concat", 00:17:23.440 "superblock": false, 00:17:23.440 "num_base_bdevs": 4, 00:17:23.440 "num_base_bdevs_discovered": 3, 
00:17:23.440 "num_base_bdevs_operational": 4, 00:17:23.440 "base_bdevs_list": [ 00:17:23.440 { 00:17:23.440 "name": "BaseBdev1", 00:17:23.440 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:23.440 "is_configured": true, 00:17:23.440 "data_offset": 0, 00:17:23.440 "data_size": 65536 00:17:23.440 }, 00:17:23.440 { 00:17:23.440 "name": null, 00:17:23.440 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:23.440 "is_configured": false, 00:17:23.440 "data_offset": 0, 00:17:23.440 "data_size": 65536 00:17:23.440 }, 00:17:23.440 { 00:17:23.440 "name": "BaseBdev3", 00:17:23.440 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:23.440 "is_configured": true, 00:17:23.440 "data_offset": 0, 00:17:23.440 "data_size": 65536 00:17:23.440 }, 00:17:23.440 { 00:17:23.440 "name": "BaseBdev4", 00:17:23.440 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:23.440 "is_configured": true, 00:17:23.440 "data_offset": 0, 00:17:23.440 "data_size": 65536 00:17:23.440 } 00:17:23.440 ] 00:17:23.440 }' 00:17:23.440 23:59:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:23.440 23:59:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.004 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.004 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:24.261 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:17:24.262 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:24.519 [2024-05-14 23:59:24.935696] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:24.519 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:24.519 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:24.519 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:24.519 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:24.519 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.520 23:59:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.776 23:59:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:17:24.776 "name": "Existed_Raid", 00:17:24.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.776 "strip_size_kb": 64, 00:17:24.776 "state": "configuring", 00:17:24.776 "raid_level": "concat", 00:17:24.776 "superblock": false, 00:17:24.776 "num_base_bdevs": 4, 00:17:24.776 "num_base_bdevs_discovered": 2, 00:17:24.776 "num_base_bdevs_operational": 4, 00:17:24.776 "base_bdevs_list": [ 00:17:24.776 { 00:17:24.776 "name": "BaseBdev1", 00:17:24.776 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:24.776 "is_configured": true, 00:17:24.776 "data_offset": 0, 00:17:24.776 "data_size": 65536 00:17:24.776 }, 00:17:24.776 { 00:17:24.776 "name": null, 00:17:24.776 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:24.776 "is_configured": false, 00:17:24.776 "data_offset": 0, 00:17:24.776 "data_size": 65536 00:17:24.776 }, 00:17:24.776 { 00:17:24.776 "name": null, 00:17:24.776 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:24.776 "is_configured": false, 00:17:24.776 "data_offset": 0, 00:17:24.776 "data_size": 65536 00:17:24.776 }, 00:17:24.776 { 00:17:24.776 "name": "BaseBdev4", 00:17:24.776 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:24.776 "is_configured": true, 00:17:24.776 "data_offset": 0, 00:17:24.776 "data_size": 65536 00:17:24.776 } 00:17:24.776 ] 00:17:24.776 }' 00:17:24.777 23:59:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:24.777 23:59:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.342 23:59:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:25.342 23:59:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.598 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:17:25.598 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:25.855 [2024-05-14 23:59:26.239174] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.855 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.112 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:26.112 "name": "Existed_Raid", 00:17:26.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.112 "strip_size_kb": 64, 00:17:26.112 "state": "configuring", 00:17:26.112 "raid_level": "concat", 00:17:26.112 "superblock": false, 00:17:26.112 "num_base_bdevs": 4, 00:17:26.112 "num_base_bdevs_discovered": 3, 00:17:26.112 "num_base_bdevs_operational": 4, 00:17:26.112 "base_bdevs_list": [ 00:17:26.112 { 00:17:26.112 "name": "BaseBdev1", 00:17:26.112 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:26.112 "is_configured": true, 00:17:26.112 "data_offset": 0, 00:17:26.112 "data_size": 65536 00:17:26.112 }, 00:17:26.112 { 00:17:26.112 "name": null, 00:17:26.112 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:26.112 "is_configured": false, 00:17:26.112 "data_offset": 0, 00:17:26.112 "data_size": 65536 00:17:26.112 }, 00:17:26.112 { 00:17:26.112 "name": "BaseBdev3", 00:17:26.112 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:26.112 "is_configured": true, 00:17:26.112 "data_offset": 0, 00:17:26.112 "data_size": 65536 00:17:26.112 }, 00:17:26.112 { 00:17:26.112 "name": "BaseBdev4", 00:17:26.112 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:26.112 "is_configured": true, 00:17:26.112 "data_offset": 0, 00:17:26.112 "data_size": 65536 00:17:26.112 } 00:17:26.112 ] 00:17:26.112 }' 00:17:26.112 23:59:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:26.112 23:59:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.678 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:26.678 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.936 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:17:26.936 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:26.936 [2024-05-14 23:59:27.498534] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:27.197 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:27.198 "name": "Existed_Raid", 00:17:27.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.198 "strip_size_kb": 64, 00:17:27.198 "state": "configuring", 00:17:27.198 "raid_level": "concat", 00:17:27.198 "superblock": false, 00:17:27.198 "num_base_bdevs": 4, 00:17:27.198 "num_base_bdevs_discovered": 2, 00:17:27.198 "num_base_bdevs_operational": 4, 00:17:27.198 "base_bdevs_list": [ 00:17:27.198 { 00:17:27.198 "name": null, 00:17:27.198 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:27.198 "is_configured": false, 00:17:27.198 "data_offset": 0, 00:17:27.198 "data_size": 65536 00:17:27.198 }, 00:17:27.198 { 00:17:27.198 "name": null, 00:17:27.198 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:27.198 "is_configured": false, 00:17:27.198 "data_offset": 0, 00:17:27.198 "data_size": 65536 00:17:27.198 }, 00:17:27.198 { 00:17:27.198 "name": "BaseBdev3", 00:17:27.198 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:27.198 "is_configured": true, 00:17:27.198 "data_offset": 0, 00:17:27.198 "data_size": 65536 00:17:27.198 }, 00:17:27.198 { 00:17:27.198 "name": "BaseBdev4", 00:17:27.198 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:27.198 "is_configured": true, 00:17:27.198 "data_offset": 0, 00:17:27.198 "data_size": 65536 00:17:27.198 } 00:17:27.198 ] 00:17:27.198 }' 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:27.198 23:59:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.765 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.765 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:28.024 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:17:28.024 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:28.283 [2024-05-14 23:59:28.828680] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:28.283 23:59:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.283 23:59:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.542 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:28.542 "name": "Existed_Raid", 00:17:28.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.542 "strip_size_kb": 64, 00:17:28.542 "state": "configuring", 00:17:28.542 "raid_level": "concat", 00:17:28.542 "superblock": false, 00:17:28.542 "num_base_bdevs": 4, 00:17:28.542 "num_base_bdevs_discovered": 3, 00:17:28.542 "num_base_bdevs_operational": 4, 00:17:28.542 "base_bdevs_list": [ 00:17:28.542 { 00:17:28.542 "name": null, 00:17:28.542 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:28.542 "is_configured": false, 00:17:28.542 "data_offset": 0, 00:17:28.542 "data_size": 65536 00:17:28.542 }, 00:17:28.542 { 00:17:28.542 "name": "BaseBdev2", 00:17:28.542 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:28.542 "is_configured": true, 00:17:28.542 "data_offset": 0, 00:17:28.542 "data_size": 65536 00:17:28.542 }, 00:17:28.542 { 00:17:28.542 "name": "BaseBdev3", 00:17:28.542 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:28.542 "is_configured": true, 00:17:28.542 "data_offset": 0, 00:17:28.542 "data_size": 65536 00:17:28.542 }, 00:17:28.542 { 00:17:28.542 "name": "BaseBdev4", 00:17:28.542 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:28.542 "is_configured": true, 00:17:28.542 "data_offset": 0, 00:17:28.542 "data_size": 65536 00:17:28.542 } 00:17:28.543 ] 00:17:28.543 }' 00:17:28.543 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:28.543 23:59:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.111 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.111 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:29.369 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:17:29.369 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.369 23:59:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:29.627 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b NewBaseBdev -u c3c681a8-dc34-443f-ab32-70aaf687804a 00:17:29.885 [2024-05-14 23:59:30.428310] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:29.885 [2024-05-14 23:59:30.428350] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x267b6d0 00:17:29.885 [2024-05-14 23:59:30.428359] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:29.885 [2024-05-14 23:59:30.428561] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2597340 00:17:29.885 [2024-05-14 23:59:30.428690] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x267b6d0 00:17:29.885 [2024-05-14 23:59:30.428700] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x267b6d0 00:17:29.885 [2024-05-14 23:59:30.428865] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.885 NewBaseBdev 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:29.885 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:30.154 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:30.440 [ 00:17:30.440 { 00:17:30.440 "name": "NewBaseBdev", 00:17:30.440 "aliases": [ 00:17:30.440 "c3c681a8-dc34-443f-ab32-70aaf687804a" 00:17:30.440 ], 00:17:30.440 "product_name": "Malloc disk", 00:17:30.440 "block_size": 512, 00:17:30.440 "num_blocks": 65536, 00:17:30.440 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:30.440 "assigned_rate_limits": { 00:17:30.440 "rw_ios_per_sec": 0, 00:17:30.440 "rw_mbytes_per_sec": 0, 00:17:30.440 "r_mbytes_per_sec": 0, 00:17:30.440 "w_mbytes_per_sec": 0 00:17:30.440 }, 00:17:30.440 "claimed": true, 00:17:30.440 "claim_type": "exclusive_write", 00:17:30.440 "zoned": false, 00:17:30.440 "supported_io_types": { 00:17:30.440 "read": true, 00:17:30.440 "write": true, 00:17:30.440 "unmap": true, 00:17:30.440 "write_zeroes": true, 00:17:30.440 "flush": true, 00:17:30.440 "reset": true, 00:17:30.440 "compare": false, 00:17:30.440 "compare_and_write": false, 00:17:30.440 "abort": true, 00:17:30.440 "nvme_admin": false, 00:17:30.440 "nvme_io": false 00:17:30.440 }, 00:17:30.440 "memory_domains": [ 00:17:30.440 { 00:17:30.440 "dma_device_id": "system", 00:17:30.440 "dma_device_type": 1 00:17:30.440 }, 00:17:30.440 { 00:17:30.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.440 "dma_device_type": 2 00:17:30.440 } 00:17:30.440 ], 00:17:30.440 "driver_specific": {} 00:17:30.440 } 00:17:30.440 ] 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
return 0 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.440 23:59:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.712 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:30.712 "name": "Existed_Raid", 00:17:30.712 "uuid": "184afd51-4178-461f-a639-0a889a48c782", 00:17:30.712 "strip_size_kb": 64, 00:17:30.712 "state": "online", 00:17:30.712 "raid_level": "concat", 00:17:30.712 "superblock": false, 00:17:30.712 "num_base_bdevs": 4, 00:17:30.712 "num_base_bdevs_discovered": 4, 00:17:30.712 "num_base_bdevs_operational": 4, 00:17:30.712 "base_bdevs_list": [ 00:17:30.712 { 00:17:30.712 "name": "NewBaseBdev", 00:17:30.712 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:30.712 "is_configured": true, 00:17:30.712 "data_offset": 0, 00:17:30.712 "data_size": 65536 00:17:30.712 }, 00:17:30.712 { 00:17:30.712 "name": "BaseBdev2", 00:17:30.712 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:30.712 "is_configured": true, 00:17:30.712 "data_offset": 0, 00:17:30.712 "data_size": 65536 00:17:30.712 }, 00:17:30.712 { 00:17:30.712 "name": "BaseBdev3", 00:17:30.712 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:30.712 "is_configured": true, 00:17:30.712 "data_offset": 0, 00:17:30.712 "data_size": 65536 00:17:30.712 }, 00:17:30.712 { 00:17:30.712 "name": "BaseBdev4", 00:17:30.712 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:30.712 "is_configured": true, 00:17:30.712 "data_offset": 0, 00:17:30.712 "data_size": 65536 00:17:30.712 } 00:17:30.712 ] 00:17:30.712 }' 00:17:30.712 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:30.712 23:59:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:31.277 23:59:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:31.535 [2024-05-14 23:59:31.980868] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.535 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:31.535 "name": "Existed_Raid", 00:17:31.535 "aliases": [ 00:17:31.535 "184afd51-4178-461f-a639-0a889a48c782" 00:17:31.535 ], 00:17:31.535 "product_name": "Raid Volume", 00:17:31.535 "block_size": 512, 00:17:31.535 "num_blocks": 262144, 00:17:31.535 "uuid": "184afd51-4178-461f-a639-0a889a48c782", 00:17:31.535 "assigned_rate_limits": { 00:17:31.535 "rw_ios_per_sec": 0, 00:17:31.535 "rw_mbytes_per_sec": 0, 00:17:31.535 "r_mbytes_per_sec": 0, 00:17:31.535 "w_mbytes_per_sec": 0 00:17:31.535 }, 00:17:31.535 "claimed": false, 00:17:31.535 "zoned": false, 00:17:31.535 "supported_io_types": { 00:17:31.535 "read": true, 00:17:31.535 "write": true, 00:17:31.535 "unmap": true, 00:17:31.535 "write_zeroes": true, 00:17:31.535 "flush": true, 00:17:31.535 "reset": true, 00:17:31.535 "compare": false, 00:17:31.535 "compare_and_write": false, 00:17:31.535 "abort": false, 00:17:31.535 "nvme_admin": false, 00:17:31.535 "nvme_io": false 00:17:31.535 }, 00:17:31.535 "memory_domains": [ 00:17:31.535 { 00:17:31.535 "dma_device_id": "system", 00:17:31.535 "dma_device_type": 1 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.535 "dma_device_type": 2 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "system", 00:17:31.535 "dma_device_type": 1 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.535 "dma_device_type": 2 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "system", 00:17:31.535 "dma_device_type": 1 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.535 "dma_device_type": 2 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "system", 00:17:31.535 "dma_device_type": 1 00:17:31.535 }, 00:17:31.535 { 00:17:31.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.535 "dma_device_type": 2 00:17:31.536 } 00:17:31.536 ], 00:17:31.536 "driver_specific": { 00:17:31.536 "raid": { 00:17:31.536 "uuid": "184afd51-4178-461f-a639-0a889a48c782", 00:17:31.536 "strip_size_kb": 64, 00:17:31.536 "state": "online", 00:17:31.536 "raid_level": "concat", 00:17:31.536 "superblock": false, 00:17:31.536 "num_base_bdevs": 4, 00:17:31.536 "num_base_bdevs_discovered": 4, 00:17:31.536 "num_base_bdevs_operational": 4, 00:17:31.536 "base_bdevs_list": [ 00:17:31.536 { 00:17:31.536 "name": "NewBaseBdev", 00:17:31.536 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:31.536 "is_configured": true, 00:17:31.536 "data_offset": 0, 00:17:31.536 "data_size": 65536 00:17:31.536 }, 00:17:31.536 { 00:17:31.536 "name": "BaseBdev2", 00:17:31.536 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:31.536 "is_configured": true, 00:17:31.536 "data_offset": 0, 00:17:31.536 "data_size": 65536 00:17:31.536 }, 
00:17:31.536 { 00:17:31.536 "name": "BaseBdev3", 00:17:31.536 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:31.536 "is_configured": true, 00:17:31.536 "data_offset": 0, 00:17:31.536 "data_size": 65536 00:17:31.536 }, 00:17:31.536 { 00:17:31.536 "name": "BaseBdev4", 00:17:31.536 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:31.536 "is_configured": true, 00:17:31.536 "data_offset": 0, 00:17:31.536 "data_size": 65536 00:17:31.536 } 00:17:31.536 ] 00:17:31.536 } 00:17:31.536 } 00:17:31.536 }' 00:17:31.536 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:31.536 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:17:31.536 BaseBdev2 00:17:31.536 BaseBdev3 00:17:31.536 BaseBdev4' 00:17:31.536 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:31.536 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:31.536 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:31.796 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:31.796 "name": "NewBaseBdev", 00:17:31.796 "aliases": [ 00:17:31.796 "c3c681a8-dc34-443f-ab32-70aaf687804a" 00:17:31.796 ], 00:17:31.796 "product_name": "Malloc disk", 00:17:31.796 "block_size": 512, 00:17:31.796 "num_blocks": 65536, 00:17:31.796 "uuid": "c3c681a8-dc34-443f-ab32-70aaf687804a", 00:17:31.796 "assigned_rate_limits": { 00:17:31.796 "rw_ios_per_sec": 0, 00:17:31.796 "rw_mbytes_per_sec": 0, 00:17:31.796 "r_mbytes_per_sec": 0, 00:17:31.796 "w_mbytes_per_sec": 0 00:17:31.796 }, 00:17:31.796 "claimed": true, 00:17:31.796 "claim_type": "exclusive_write", 00:17:31.796 "zoned": false, 00:17:31.796 "supported_io_types": { 00:17:31.796 "read": true, 00:17:31.796 "write": true, 00:17:31.796 "unmap": true, 00:17:31.796 "write_zeroes": true, 00:17:31.796 "flush": true, 00:17:31.796 "reset": true, 00:17:31.796 "compare": false, 00:17:31.796 "compare_and_write": false, 00:17:31.796 "abort": true, 00:17:31.796 "nvme_admin": false, 00:17:31.796 "nvme_io": false 00:17:31.796 }, 00:17:31.796 "memory_domains": [ 00:17:31.796 { 00:17:31.796 "dma_device_id": "system", 00:17:31.796 "dma_device_type": 1 00:17:31.796 }, 00:17:31.796 { 00:17:31.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.796 "dma_device_type": 2 00:17:31.796 } 00:17:31.796 ], 00:17:31.796 "driver_specific": {} 00:17:31.796 }' 00:17:31.796 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:31.796 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.054 23:59:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.054 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.312 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:32.312 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:32.312 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:32.312 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:32.570 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:32.571 "name": "BaseBdev2", 00:17:32.571 "aliases": [ 00:17:32.571 "5f99205a-57c4-4c46-b580-8999372308b7" 00:17:32.571 ], 00:17:32.571 "product_name": "Malloc disk", 00:17:32.571 "block_size": 512, 00:17:32.571 "num_blocks": 65536, 00:17:32.571 "uuid": "5f99205a-57c4-4c46-b580-8999372308b7", 00:17:32.571 "assigned_rate_limits": { 00:17:32.571 "rw_ios_per_sec": 0, 00:17:32.571 "rw_mbytes_per_sec": 0, 00:17:32.571 "r_mbytes_per_sec": 0, 00:17:32.571 "w_mbytes_per_sec": 0 00:17:32.571 }, 00:17:32.571 "claimed": true, 00:17:32.571 "claim_type": "exclusive_write", 00:17:32.571 "zoned": false, 00:17:32.571 "supported_io_types": { 00:17:32.571 "read": true, 00:17:32.571 "write": true, 00:17:32.571 "unmap": true, 00:17:32.571 "write_zeroes": true, 00:17:32.571 "flush": true, 00:17:32.571 "reset": true, 00:17:32.571 "compare": false, 00:17:32.571 "compare_and_write": false, 00:17:32.571 "abort": true, 00:17:32.571 "nvme_admin": false, 00:17:32.571 "nvme_io": false 00:17:32.571 }, 00:17:32.571 "memory_domains": [ 00:17:32.571 { 00:17:32.571 "dma_device_id": "system", 00:17:32.571 "dma_device_type": 1 00:17:32.571 }, 00:17:32.571 { 00:17:32.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.571 "dma_device_type": 2 00:17:32.571 } 00:17:32.571 ], 00:17:32.571 "driver_specific": {} 00:17:32.571 }' 00:17:32.571 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.571 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.571 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:32.571 23:59:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.571 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.829 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.829 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:32.829 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for 
name in $base_bdev_names 00:17:32.829 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:32.829 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:33.087 "name": "BaseBdev3", 00:17:33.087 "aliases": [ 00:17:33.087 "dd37aea9-5e92-4375-92da-887c373673a6" 00:17:33.087 ], 00:17:33.087 "product_name": "Malloc disk", 00:17:33.087 "block_size": 512, 00:17:33.087 "num_blocks": 65536, 00:17:33.087 "uuid": "dd37aea9-5e92-4375-92da-887c373673a6", 00:17:33.087 "assigned_rate_limits": { 00:17:33.087 "rw_ios_per_sec": 0, 00:17:33.087 "rw_mbytes_per_sec": 0, 00:17:33.087 "r_mbytes_per_sec": 0, 00:17:33.087 "w_mbytes_per_sec": 0 00:17:33.087 }, 00:17:33.087 "claimed": true, 00:17:33.087 "claim_type": "exclusive_write", 00:17:33.087 "zoned": false, 00:17:33.087 "supported_io_types": { 00:17:33.087 "read": true, 00:17:33.087 "write": true, 00:17:33.087 "unmap": true, 00:17:33.087 "write_zeroes": true, 00:17:33.087 "flush": true, 00:17:33.087 "reset": true, 00:17:33.087 "compare": false, 00:17:33.087 "compare_and_write": false, 00:17:33.087 "abort": true, 00:17:33.087 "nvme_admin": false, 00:17:33.087 "nvme_io": false 00:17:33.087 }, 00:17:33.087 "memory_domains": [ 00:17:33.087 { 00:17:33.087 "dma_device_id": "system", 00:17:33.087 "dma_device_type": 1 00:17:33.087 }, 00:17:33.087 { 00:17:33.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.087 "dma_device_type": 2 00:17:33.087 } 00:17:33.087 ], 00:17:33.087 "driver_specific": {} 00:17:33.087 }' 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.087 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:33.345 23:59:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:33.603 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:33.603 "name": "BaseBdev4", 00:17:33.603 
"aliases": [ 00:17:33.603 "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb" 00:17:33.603 ], 00:17:33.603 "product_name": "Malloc disk", 00:17:33.603 "block_size": 512, 00:17:33.603 "num_blocks": 65536, 00:17:33.603 "uuid": "d704c5e4-af1c-41b3-8a89-35aa2fb90ceb", 00:17:33.603 "assigned_rate_limits": { 00:17:33.603 "rw_ios_per_sec": 0, 00:17:33.603 "rw_mbytes_per_sec": 0, 00:17:33.603 "r_mbytes_per_sec": 0, 00:17:33.603 "w_mbytes_per_sec": 0 00:17:33.603 }, 00:17:33.603 "claimed": true, 00:17:33.603 "claim_type": "exclusive_write", 00:17:33.603 "zoned": false, 00:17:33.603 "supported_io_types": { 00:17:33.603 "read": true, 00:17:33.603 "write": true, 00:17:33.603 "unmap": true, 00:17:33.603 "write_zeroes": true, 00:17:33.603 "flush": true, 00:17:33.603 "reset": true, 00:17:33.603 "compare": false, 00:17:33.603 "compare_and_write": false, 00:17:33.603 "abort": true, 00:17:33.603 "nvme_admin": false, 00:17:33.603 "nvme_io": false 00:17:33.603 }, 00:17:33.603 "memory_domains": [ 00:17:33.603 { 00:17:33.603 "dma_device_id": "system", 00:17:33.603 "dma_device_type": 1 00:17:33.603 }, 00:17:33.603 { 00:17:33.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.603 "dma_device_type": 2 00:17:33.603 } 00:17:33.603 ], 00:17:33.603 "driver_specific": {} 00:17:33.603 }' 00:17:33.603 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.603 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.603 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:33.603 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:33.861 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:34.120 [2024-05-14 23:59:34.607539] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:34.120 [2024-05-14 23:59:34.607567] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:34.120 [2024-05-14 23:59:34.607625] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:34.120 [2024-05-14 23:59:34.607686] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:34.120 [2024-05-14 23:59:34.607699] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x267b6d0 name Existed_Raid, state offline 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 446327 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@946 -- # '[' -z 446327 ']' 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 446327 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 446327 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 446327' 00:17:34.120 killing process with pid 446327 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 446327 00:17:34.120 [2024-05-14 23:59:34.685957] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:34.120 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 446327 00:17:34.378 [2024-05-14 23:59:34.725692] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:34.638 23:59:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:17:34.638 00:17:34.638 real 0m31.916s 00:17:34.638 user 0m58.564s 00:17:34.638 sys 0m5.684s 00:17:34.638 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:34.638 23:59:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.638 ************************************ 00:17:34.638 END TEST raid_state_function_test 00:17:34.638 ************************************ 00:17:34.638 23:59:35 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:17:34.638 23:59:35 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:17:34.638 23:59:35 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:34.638 23:59:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:34.638 ************************************ 00:17:34.638 START TEST raid_state_function_test_sb 00:17:34.638 ************************************ 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 true 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= 
num_base_bdevs )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=451113 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 451113' 00:17:34.638 Process raid pid: 451113 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 451113 /var/tmp/spdk-raid.sock 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 451113 ']' 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:34.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:34.638 23:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:34.638 [2024-05-14 23:59:35.142304] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:17:34.638 [2024-05-14 23:59:35.142374] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:34.898 [2024-05-14 23:59:35.275949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:34.898 [2024-05-14 23:59:35.381688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:34.898 [2024-05-14 23:59:35.440474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:34.898 [2024-05-14 23:59:35.440504] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:35.834 [2024-05-14 23:59:36.302686] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:35.834 [2024-05-14 23:59:36.302733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:35.834 [2024-05-14 23:59:36.302744] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:35.834 [2024-05-14 23:59:36.302756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:35.834 [2024-05-14 23:59:36.302765] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:35.834 [2024-05-14 23:59:36.302776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:35.834 [2024-05-14 23:59:36.302785] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:35.834 [2024-05-14 23:59:36.302796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:35.834 
23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:35.834 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:35.835 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.835 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.093 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:36.093 "name": "Existed_Raid", 00:17:36.093 "uuid": "4168836f-b420-422b-9a04-541e89fcf341", 00:17:36.093 "strip_size_kb": 64, 00:17:36.093 "state": "configuring", 00:17:36.093 "raid_level": "concat", 00:17:36.093 "superblock": true, 00:17:36.093 "num_base_bdevs": 4, 00:17:36.093 "num_base_bdevs_discovered": 0, 00:17:36.093 "num_base_bdevs_operational": 4, 00:17:36.093 "base_bdevs_list": [ 00:17:36.093 { 00:17:36.093 "name": "BaseBdev1", 00:17:36.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.093 "is_configured": false, 00:17:36.093 "data_offset": 0, 00:17:36.093 "data_size": 0 00:17:36.093 }, 00:17:36.093 { 00:17:36.093 "name": "BaseBdev2", 00:17:36.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.093 "is_configured": false, 00:17:36.093 "data_offset": 0, 00:17:36.093 "data_size": 0 00:17:36.093 }, 00:17:36.093 { 00:17:36.093 "name": "BaseBdev3", 00:17:36.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.093 "is_configured": false, 00:17:36.093 "data_offset": 0, 00:17:36.093 "data_size": 0 00:17:36.093 }, 00:17:36.093 { 00:17:36.093 "name": "BaseBdev4", 00:17:36.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.093 "is_configured": false, 00:17:36.093 "data_offset": 0, 00:17:36.093 "data_size": 0 00:17:36.093 } 00:17:36.093 ] 00:17:36.093 }' 00:17:36.093 23:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:36.093 23:59:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.660 23:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:36.918 [2024-05-14 23:59:37.369334] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:36.918 [2024-05-14 23:59:37.369369] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1775c00 name Existed_Raid, state configuring 00:17:36.918 23:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:37.176 [2024-05-14 23:59:37.610003] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.176 [2024-05-14 23:59:37.610034] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.176 [2024-05-14 23:59:37.610045] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:37.176 [2024-05-14 23:59:37.610057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:17:37.176 [2024-05-14 23:59:37.610066] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:37.176 [2024-05-14 23:59:37.610077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:37.176 [2024-05-14 23:59:37.610086] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:37.177 [2024-05-14 23:59:37.610097] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:37.177 23:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:37.436 [2024-05-14 23:59:37.860602] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:37.436 BaseBdev1 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:37.436 23:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.695 23:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:37.954 [ 00:17:37.954 { 00:17:37.955 "name": "BaseBdev1", 00:17:37.955 "aliases": [ 00:17:37.955 "4bf5fc96-4b88-47a1-b725-9fb40b62503c" 00:17:37.955 ], 00:17:37.955 "product_name": "Malloc disk", 00:17:37.955 "block_size": 512, 00:17:37.955 "num_blocks": 65536, 00:17:37.955 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:37.955 "assigned_rate_limits": { 00:17:37.955 "rw_ios_per_sec": 0, 00:17:37.955 "rw_mbytes_per_sec": 0, 00:17:37.955 "r_mbytes_per_sec": 0, 00:17:37.955 "w_mbytes_per_sec": 0 00:17:37.955 }, 00:17:37.955 "claimed": true, 00:17:37.955 "claim_type": "exclusive_write", 00:17:37.955 "zoned": false, 00:17:37.955 "supported_io_types": { 00:17:37.955 "read": true, 00:17:37.955 "write": true, 00:17:37.955 "unmap": true, 00:17:37.955 "write_zeroes": true, 00:17:37.955 "flush": true, 00:17:37.955 "reset": true, 00:17:37.955 "compare": false, 00:17:37.955 "compare_and_write": false, 00:17:37.955 "abort": true, 00:17:37.955 "nvme_admin": false, 00:17:37.955 "nvme_io": false 00:17:37.955 }, 00:17:37.955 "memory_domains": [ 00:17:37.955 { 00:17:37.955 "dma_device_id": "system", 00:17:37.955 "dma_device_type": 1 00:17:37.955 }, 00:17:37.955 { 00:17:37.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.955 "dma_device_type": 2 00:17:37.955 } 00:17:37.955 ], 00:17:37.955 "driver_specific": {} 00:17:37.955 } 00:17:37.955 ] 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:37.955 23:59:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.955 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.213 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:38.214 "name": "Existed_Raid", 00:17:38.214 "uuid": "e02eb1dc-38d6-4518-9898-49ae917b48fa", 00:17:38.214 "strip_size_kb": 64, 00:17:38.214 "state": "configuring", 00:17:38.214 "raid_level": "concat", 00:17:38.214 "superblock": true, 00:17:38.214 "num_base_bdevs": 4, 00:17:38.214 "num_base_bdevs_discovered": 1, 00:17:38.214 "num_base_bdevs_operational": 4, 00:17:38.214 "base_bdevs_list": [ 00:17:38.214 { 00:17:38.214 "name": "BaseBdev1", 00:17:38.214 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:38.214 "is_configured": true, 00:17:38.214 "data_offset": 2048, 00:17:38.214 "data_size": 63488 00:17:38.214 }, 00:17:38.214 { 00:17:38.214 "name": "BaseBdev2", 00:17:38.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.214 "is_configured": false, 00:17:38.214 "data_offset": 0, 00:17:38.214 "data_size": 0 00:17:38.214 }, 00:17:38.214 { 00:17:38.214 "name": "BaseBdev3", 00:17:38.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.214 "is_configured": false, 00:17:38.214 "data_offset": 0, 00:17:38.214 "data_size": 0 00:17:38.214 }, 00:17:38.214 { 00:17:38.214 "name": "BaseBdev4", 00:17:38.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.214 "is_configured": false, 00:17:38.214 "data_offset": 0, 00:17:38.214 "data_size": 0 00:17:38.214 } 00:17:38.214 ] 00:17:38.214 }' 00:17:38.214 23:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:38.214 23:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:38.782 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:39.041 [2024-05-14 23:59:39.424744] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:39.041 [2024-05-14 23:59:39.424793] bdev_raid.c: 350:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1775ea0 name Existed_Raid, state configuring 00:17:39.041 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:39.300 [2024-05-14 23:59:39.665437] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.300 [2024-05-14 23:59:39.666977] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:39.300 [2024-05-14 23:59:39.667013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:39.300 [2024-05-14 23:59:39.667024] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:39.300 [2024-05-14 23:59:39.667035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:39.300 [2024-05-14 23:59:39.667044] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:39.300 [2024-05-14 23:59:39.667056] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.300 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.560 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:39.560 "name": "Existed_Raid", 00:17:39.560 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:39.560 "strip_size_kb": 64, 00:17:39.560 "state": "configuring", 00:17:39.560 "raid_level": "concat", 00:17:39.560 "superblock": true, 00:17:39.560 "num_base_bdevs": 4, 00:17:39.560 "num_base_bdevs_discovered": 1, 00:17:39.560 "num_base_bdevs_operational": 4, 00:17:39.560 "base_bdevs_list": [ 00:17:39.560 { 00:17:39.560 "name": 
"BaseBdev1", 00:17:39.560 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:39.560 "is_configured": true, 00:17:39.560 "data_offset": 2048, 00:17:39.560 "data_size": 63488 00:17:39.560 }, 00:17:39.560 { 00:17:39.560 "name": "BaseBdev2", 00:17:39.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.560 "is_configured": false, 00:17:39.560 "data_offset": 0, 00:17:39.560 "data_size": 0 00:17:39.560 }, 00:17:39.560 { 00:17:39.560 "name": "BaseBdev3", 00:17:39.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.560 "is_configured": false, 00:17:39.560 "data_offset": 0, 00:17:39.560 "data_size": 0 00:17:39.560 }, 00:17:39.560 { 00:17:39.560 "name": "BaseBdev4", 00:17:39.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.560 "is_configured": false, 00:17:39.560 "data_offset": 0, 00:17:39.560 "data_size": 0 00:17:39.560 } 00:17:39.560 ] 00:17:39.560 }' 00:17:39.560 23:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:39.560 23:59:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:40.128 23:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:40.387 [2024-05-14 23:59:40.739705] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:40.387 BaseBdev2 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:40.387 23:59:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:40.646 23:59:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:40.646 [ 00:17:40.646 { 00:17:40.646 "name": "BaseBdev2", 00:17:40.646 "aliases": [ 00:17:40.646 "45d7d3ef-c705-44da-ab59-8fa94b2503fb" 00:17:40.646 ], 00:17:40.646 "product_name": "Malloc disk", 00:17:40.646 "block_size": 512, 00:17:40.646 "num_blocks": 65536, 00:17:40.646 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:40.646 "assigned_rate_limits": { 00:17:40.646 "rw_ios_per_sec": 0, 00:17:40.646 "rw_mbytes_per_sec": 0, 00:17:40.646 "r_mbytes_per_sec": 0, 00:17:40.646 "w_mbytes_per_sec": 0 00:17:40.646 }, 00:17:40.646 "claimed": true, 00:17:40.646 "claim_type": "exclusive_write", 00:17:40.646 "zoned": false, 00:17:40.646 "supported_io_types": { 00:17:40.646 "read": true, 00:17:40.646 "write": true, 00:17:40.646 "unmap": true, 00:17:40.646 "write_zeroes": true, 00:17:40.646 "flush": true, 00:17:40.646 "reset": true, 00:17:40.646 "compare": false, 00:17:40.646 "compare_and_write": false, 00:17:40.646 "abort": true, 00:17:40.646 "nvme_admin": 
false, 00:17:40.646 "nvme_io": false 00:17:40.646 }, 00:17:40.646 "memory_domains": [ 00:17:40.646 { 00:17:40.646 "dma_device_id": "system", 00:17:40.646 "dma_device_type": 1 00:17:40.646 }, 00:17:40.646 { 00:17:40.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.646 "dma_device_type": 2 00:17:40.646 } 00:17:40.646 ], 00:17:40.646 "driver_specific": {} 00:17:40.646 } 00:17:40.646 ] 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.905 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.163 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:41.163 "name": "Existed_Raid", 00:17:41.163 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:41.163 "strip_size_kb": 64, 00:17:41.163 "state": "configuring", 00:17:41.163 "raid_level": "concat", 00:17:41.163 "superblock": true, 00:17:41.164 "num_base_bdevs": 4, 00:17:41.164 "num_base_bdevs_discovered": 2, 00:17:41.164 "num_base_bdevs_operational": 4, 00:17:41.164 "base_bdevs_list": [ 00:17:41.164 { 00:17:41.164 "name": "BaseBdev1", 00:17:41.164 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:41.164 "is_configured": true, 00:17:41.164 "data_offset": 2048, 00:17:41.164 "data_size": 63488 00:17:41.164 }, 00:17:41.164 { 00:17:41.164 "name": "BaseBdev2", 00:17:41.164 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:41.164 "is_configured": true, 00:17:41.164 "data_offset": 2048, 00:17:41.164 "data_size": 63488 00:17:41.164 }, 00:17:41.164 { 00:17:41.164 "name": "BaseBdev3", 00:17:41.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.164 "is_configured": false, 00:17:41.164 "data_offset": 0, 00:17:41.164 "data_size": 0 00:17:41.164 }, 00:17:41.164 { 00:17:41.164 "name": "BaseBdev4", 00:17:41.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.164 
"is_configured": false, 00:17:41.164 "data_offset": 0, 00:17:41.164 "data_size": 0 00:17:41.164 } 00:17:41.164 ] 00:17:41.164 }' 00:17:41.164 23:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:41.164 23:59:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.729 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:41.988 [2024-05-14 23:59:42.323325] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.988 BaseBdev3 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.988 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:42.246 [ 00:17:42.246 { 00:17:42.246 "name": "BaseBdev3", 00:17:42.246 "aliases": [ 00:17:42.246 "d9082f42-a20d-4dc3-ae3b-4cced6ef286a" 00:17:42.246 ], 00:17:42.246 "product_name": "Malloc disk", 00:17:42.246 "block_size": 512, 00:17:42.246 "num_blocks": 65536, 00:17:42.246 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:42.246 "assigned_rate_limits": { 00:17:42.246 "rw_ios_per_sec": 0, 00:17:42.246 "rw_mbytes_per_sec": 0, 00:17:42.246 "r_mbytes_per_sec": 0, 00:17:42.246 "w_mbytes_per_sec": 0 00:17:42.246 }, 00:17:42.246 "claimed": true, 00:17:42.246 "claim_type": "exclusive_write", 00:17:42.246 "zoned": false, 00:17:42.246 "supported_io_types": { 00:17:42.246 "read": true, 00:17:42.246 "write": true, 00:17:42.246 "unmap": true, 00:17:42.246 "write_zeroes": true, 00:17:42.246 "flush": true, 00:17:42.246 "reset": true, 00:17:42.246 "compare": false, 00:17:42.246 "compare_and_write": false, 00:17:42.246 "abort": true, 00:17:42.246 "nvme_admin": false, 00:17:42.246 "nvme_io": false 00:17:42.246 }, 00:17:42.246 "memory_domains": [ 00:17:42.246 { 00:17:42.246 "dma_device_id": "system", 00:17:42.246 "dma_device_type": 1 00:17:42.246 }, 00:17:42.246 { 00:17:42.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.246 "dma_device_type": 2 00:17:42.246 } 00:17:42.246 ], 00:17:42.246 "driver_specific": {} 00:17:42.246 } 00:17:42.246 ] 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:42.246 23:59:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.246 23:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.504 23:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:42.504 "name": "Existed_Raid", 00:17:42.504 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:42.504 "strip_size_kb": 64, 00:17:42.504 "state": "configuring", 00:17:42.504 "raid_level": "concat", 00:17:42.504 "superblock": true, 00:17:42.504 "num_base_bdevs": 4, 00:17:42.504 "num_base_bdevs_discovered": 3, 00:17:42.504 "num_base_bdevs_operational": 4, 00:17:42.504 "base_bdevs_list": [ 00:17:42.504 { 00:17:42.504 "name": "BaseBdev1", 00:17:42.504 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:42.504 "is_configured": true, 00:17:42.504 "data_offset": 2048, 00:17:42.504 "data_size": 63488 00:17:42.504 }, 00:17:42.504 { 00:17:42.504 "name": "BaseBdev2", 00:17:42.504 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:42.504 "is_configured": true, 00:17:42.504 "data_offset": 2048, 00:17:42.504 "data_size": 63488 00:17:42.504 }, 00:17:42.504 { 00:17:42.504 "name": "BaseBdev3", 00:17:42.504 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:42.504 "is_configured": true, 00:17:42.504 "data_offset": 2048, 00:17:42.504 "data_size": 63488 00:17:42.504 }, 00:17:42.504 { 00:17:42.504 "name": "BaseBdev4", 00:17:42.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.504 "is_configured": false, 00:17:42.504 "data_offset": 0, 00:17:42.504 "data_size": 0 00:17:42.504 } 00:17:42.504 ] 00:17:42.504 }' 00:17:42.504 23:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:42.504 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:43.069 23:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:43.327 [2024-05-14 23:59:43.862761] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:43.327 [2024-05-14 23:59:43.862930] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1775470 00:17:43.327 [2024-05-14 23:59:43.862944] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:43.327 [2024-05-14 23:59:43.863133] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1775b40 00:17:43.327 [2024-05-14 23:59:43.863262] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1775470 00:17:43.327 [2024-05-14 23:59:43.863273] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1775470 00:17:43.327 [2024-05-14 23:59:43.863373] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.327 BaseBdev4 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:43.327 23:59:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.587 23:59:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:43.908 [ 00:17:43.908 { 00:17:43.908 "name": "BaseBdev4", 00:17:43.908 "aliases": [ 00:17:43.908 "92b70663-4a60-4f2d-a929-3cb9d99099ee" 00:17:43.908 ], 00:17:43.908 "product_name": "Malloc disk", 00:17:43.908 "block_size": 512, 00:17:43.908 "num_blocks": 65536, 00:17:43.908 "uuid": "92b70663-4a60-4f2d-a929-3cb9d99099ee", 00:17:43.908 "assigned_rate_limits": { 00:17:43.908 "rw_ios_per_sec": 0, 00:17:43.908 "rw_mbytes_per_sec": 0, 00:17:43.908 "r_mbytes_per_sec": 0, 00:17:43.908 "w_mbytes_per_sec": 0 00:17:43.908 }, 00:17:43.908 "claimed": true, 00:17:43.908 "claim_type": "exclusive_write", 00:17:43.908 "zoned": false, 00:17:43.908 "supported_io_types": { 00:17:43.908 "read": true, 00:17:43.908 "write": true, 00:17:43.908 "unmap": true, 00:17:43.908 "write_zeroes": true, 00:17:43.908 "flush": true, 00:17:43.908 "reset": true, 00:17:43.908 "compare": false, 00:17:43.908 "compare_and_write": false, 00:17:43.908 "abort": true, 00:17:43.908 "nvme_admin": false, 00:17:43.908 "nvme_io": false 00:17:43.908 }, 00:17:43.908 "memory_domains": [ 00:17:43.908 { 00:17:43.908 "dma_device_id": "system", 00:17:43.908 "dma_device_type": 1 00:17:43.908 }, 00:17:43.908 { 00:17:43.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.908 "dma_device_type": 2 00:17:43.908 } 00:17:43.908 ], 00:17:43.908 "driver_specific": {} 00:17:43.908 } 00:17:43.908 ] 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:43.908 
23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:43.908 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.909 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.168 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:44.168 "name": "Existed_Raid", 00:17:44.168 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:44.168 "strip_size_kb": 64, 00:17:44.168 "state": "online", 00:17:44.168 "raid_level": "concat", 00:17:44.168 "superblock": true, 00:17:44.168 "num_base_bdevs": 4, 00:17:44.168 "num_base_bdevs_discovered": 4, 00:17:44.168 "num_base_bdevs_operational": 4, 00:17:44.168 "base_bdevs_list": [ 00:17:44.168 { 00:17:44.168 "name": "BaseBdev1", 00:17:44.168 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:44.168 "is_configured": true, 00:17:44.168 "data_offset": 2048, 00:17:44.168 "data_size": 63488 00:17:44.168 }, 00:17:44.168 { 00:17:44.168 "name": "BaseBdev2", 00:17:44.168 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:44.168 "is_configured": true, 00:17:44.168 "data_offset": 2048, 00:17:44.168 "data_size": 63488 00:17:44.168 }, 00:17:44.168 { 00:17:44.168 "name": "BaseBdev3", 00:17:44.168 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:44.168 "is_configured": true, 00:17:44.168 "data_offset": 2048, 00:17:44.168 "data_size": 63488 00:17:44.168 }, 00:17:44.168 { 00:17:44.168 "name": "BaseBdev4", 00:17:44.168 "uuid": "92b70663-4a60-4f2d-a929-3cb9d99099ee", 00:17:44.168 "is_configured": true, 00:17:44.168 "data_offset": 2048, 00:17:44.168 "data_size": 63488 00:17:44.168 } 00:17:44.168 ] 00:17:44.168 }' 00:17:44.168 23:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:44.168 23:59:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:44.736 23:59:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:44.736 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:44.995 [2024-05-14 23:59:45.343125] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:44.995 "name": "Existed_Raid", 00:17:44.995 "aliases": [ 00:17:44.995 "8862704c-9797-4c11-8938-e91ae4f5acea" 00:17:44.995 ], 00:17:44.995 "product_name": "Raid Volume", 00:17:44.995 "block_size": 512, 00:17:44.995 "num_blocks": 253952, 00:17:44.995 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:44.995 "assigned_rate_limits": { 00:17:44.995 "rw_ios_per_sec": 0, 00:17:44.995 "rw_mbytes_per_sec": 0, 00:17:44.995 "r_mbytes_per_sec": 0, 00:17:44.995 "w_mbytes_per_sec": 0 00:17:44.995 }, 00:17:44.995 "claimed": false, 00:17:44.995 "zoned": false, 00:17:44.995 "supported_io_types": { 00:17:44.995 "read": true, 00:17:44.995 "write": true, 00:17:44.995 "unmap": true, 00:17:44.995 "write_zeroes": true, 00:17:44.995 "flush": true, 00:17:44.995 "reset": true, 00:17:44.995 "compare": false, 00:17:44.995 "compare_and_write": false, 00:17:44.995 "abort": false, 00:17:44.995 "nvme_admin": false, 00:17:44.995 "nvme_io": false 00:17:44.995 }, 00:17:44.995 "memory_domains": [ 00:17:44.995 { 00:17:44.995 "dma_device_id": "system", 00:17:44.995 "dma_device_type": 1 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.995 "dma_device_type": 2 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "system", 00:17:44.995 "dma_device_type": 1 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.995 "dma_device_type": 2 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "system", 00:17:44.995 "dma_device_type": 1 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.995 "dma_device_type": 2 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "system", 00:17:44.995 "dma_device_type": 1 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.995 "dma_device_type": 2 00:17:44.995 } 00:17:44.995 ], 00:17:44.995 "driver_specific": { 00:17:44.995 "raid": { 00:17:44.995 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:44.995 "strip_size_kb": 64, 00:17:44.995 "state": "online", 00:17:44.995 "raid_level": "concat", 00:17:44.995 "superblock": true, 00:17:44.995 "num_base_bdevs": 4, 00:17:44.995 "num_base_bdevs_discovered": 4, 00:17:44.995 "num_base_bdevs_operational": 4, 00:17:44.995 "base_bdevs_list": [ 00:17:44.995 { 00:17:44.995 "name": "BaseBdev1", 00:17:44.995 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:44.995 "is_configured": true, 00:17:44.995 "data_offset": 2048, 00:17:44.995 "data_size": 63488 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "name": "BaseBdev2", 00:17:44.995 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:44.995 "is_configured": true, 00:17:44.995 
"data_offset": 2048, 00:17:44.995 "data_size": 63488 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "name": "BaseBdev3", 00:17:44.995 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:44.995 "is_configured": true, 00:17:44.995 "data_offset": 2048, 00:17:44.995 "data_size": 63488 00:17:44.995 }, 00:17:44.995 { 00:17:44.995 "name": "BaseBdev4", 00:17:44.995 "uuid": "92b70663-4a60-4f2d-a929-3cb9d99099ee", 00:17:44.995 "is_configured": true, 00:17:44.995 "data_offset": 2048, 00:17:44.995 "data_size": 63488 00:17:44.995 } 00:17:44.995 ] 00:17:44.995 } 00:17:44.995 } 00:17:44.995 }' 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:17:44.995 BaseBdev2 00:17:44.995 BaseBdev3 00:17:44.995 BaseBdev4' 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:44.995 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:45.254 "name": "BaseBdev1", 00:17:45.254 "aliases": [ 00:17:45.254 "4bf5fc96-4b88-47a1-b725-9fb40b62503c" 00:17:45.254 ], 00:17:45.254 "product_name": "Malloc disk", 00:17:45.254 "block_size": 512, 00:17:45.254 "num_blocks": 65536, 00:17:45.254 "uuid": "4bf5fc96-4b88-47a1-b725-9fb40b62503c", 00:17:45.254 "assigned_rate_limits": { 00:17:45.254 "rw_ios_per_sec": 0, 00:17:45.254 "rw_mbytes_per_sec": 0, 00:17:45.254 "r_mbytes_per_sec": 0, 00:17:45.254 "w_mbytes_per_sec": 0 00:17:45.254 }, 00:17:45.254 "claimed": true, 00:17:45.254 "claim_type": "exclusive_write", 00:17:45.254 "zoned": false, 00:17:45.254 "supported_io_types": { 00:17:45.254 "read": true, 00:17:45.254 "write": true, 00:17:45.254 "unmap": true, 00:17:45.254 "write_zeroes": true, 00:17:45.254 "flush": true, 00:17:45.254 "reset": true, 00:17:45.254 "compare": false, 00:17:45.254 "compare_and_write": false, 00:17:45.254 "abort": true, 00:17:45.254 "nvme_admin": false, 00:17:45.254 "nvme_io": false 00:17:45.254 }, 00:17:45.254 "memory_domains": [ 00:17:45.254 { 00:17:45.254 "dma_device_id": "system", 00:17:45.254 "dma_device_type": 1 00:17:45.254 }, 00:17:45.254 { 00:17:45.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.254 "dma_device_type": 2 00:17:45.254 } 00:17:45.254 ], 00:17:45.254 "driver_specific": {} 00:17:45.254 }' 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.254 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:45.513 
23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:45.513 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.513 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:45.513 23:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:45.513 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:45.513 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:45.513 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:45.513 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:45.772 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:45.772 "name": "BaseBdev2", 00:17:45.772 "aliases": [ 00:17:45.772 "45d7d3ef-c705-44da-ab59-8fa94b2503fb" 00:17:45.772 ], 00:17:45.772 "product_name": "Malloc disk", 00:17:45.772 "block_size": 512, 00:17:45.772 "num_blocks": 65536, 00:17:45.772 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:45.772 "assigned_rate_limits": { 00:17:45.772 "rw_ios_per_sec": 0, 00:17:45.772 "rw_mbytes_per_sec": 0, 00:17:45.772 "r_mbytes_per_sec": 0, 00:17:45.772 "w_mbytes_per_sec": 0 00:17:45.772 }, 00:17:45.772 "claimed": true, 00:17:45.772 "claim_type": "exclusive_write", 00:17:45.772 "zoned": false, 00:17:45.772 "supported_io_types": { 00:17:45.772 "read": true, 00:17:45.772 "write": true, 00:17:45.772 "unmap": true, 00:17:45.772 "write_zeroes": true, 00:17:45.772 "flush": true, 00:17:45.772 "reset": true, 00:17:45.772 "compare": false, 00:17:45.772 "compare_and_write": false, 00:17:45.772 "abort": true, 00:17:45.772 "nvme_admin": false, 00:17:45.772 "nvme_io": false 00:17:45.772 }, 00:17:45.772 "memory_domains": [ 00:17:45.772 { 00:17:45.772 "dma_device_id": "system", 00:17:45.772 "dma_device_type": 1 00:17:45.772 }, 00:17:45.772 { 00:17:45.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.772 "dma_device_type": 2 00:17:45.772 } 00:17:45.772 ], 00:17:45.772 "driver_specific": {} 00:17:45.772 }' 00:17:45.772 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:45.772 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:45.772 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:45.772 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:46.031 23:59:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:46.031 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:46.289 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:46.289 "name": "BaseBdev3", 00:17:46.289 "aliases": [ 00:17:46.289 "d9082f42-a20d-4dc3-ae3b-4cced6ef286a" 00:17:46.289 ], 00:17:46.289 "product_name": "Malloc disk", 00:17:46.289 "block_size": 512, 00:17:46.289 "num_blocks": 65536, 00:17:46.289 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:46.289 "assigned_rate_limits": { 00:17:46.289 "rw_ios_per_sec": 0, 00:17:46.289 "rw_mbytes_per_sec": 0, 00:17:46.290 "r_mbytes_per_sec": 0, 00:17:46.290 "w_mbytes_per_sec": 0 00:17:46.290 }, 00:17:46.290 "claimed": true, 00:17:46.290 "claim_type": "exclusive_write", 00:17:46.290 "zoned": false, 00:17:46.290 "supported_io_types": { 00:17:46.290 "read": true, 00:17:46.290 "write": true, 00:17:46.290 "unmap": true, 00:17:46.290 "write_zeroes": true, 00:17:46.290 "flush": true, 00:17:46.290 "reset": true, 00:17:46.290 "compare": false, 00:17:46.290 "compare_and_write": false, 00:17:46.290 "abort": true, 00:17:46.290 "nvme_admin": false, 00:17:46.290 "nvme_io": false 00:17:46.290 }, 00:17:46.290 "memory_domains": [ 00:17:46.290 { 00:17:46.290 "dma_device_id": "system", 00:17:46.290 "dma_device_type": 1 00:17:46.290 }, 00:17:46.290 { 00:17:46.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.290 "dma_device_type": 2 00:17:46.290 } 00:17:46.290 ], 00:17:46.290 "driver_specific": {} 00:17:46.290 }' 00:17:46.290 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:46.547 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:46.547 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:46.547 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:46.547 23:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:46.547 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.547 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:46.547 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:46.547 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.547 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:46.805 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:46.805 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:46.805 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:46.805 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:46.805 
23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:47.063 "name": "BaseBdev4", 00:17:47.063 "aliases": [ 00:17:47.063 "92b70663-4a60-4f2d-a929-3cb9d99099ee" 00:17:47.063 ], 00:17:47.063 "product_name": "Malloc disk", 00:17:47.063 "block_size": 512, 00:17:47.063 "num_blocks": 65536, 00:17:47.063 "uuid": "92b70663-4a60-4f2d-a929-3cb9d99099ee", 00:17:47.063 "assigned_rate_limits": { 00:17:47.063 "rw_ios_per_sec": 0, 00:17:47.063 "rw_mbytes_per_sec": 0, 00:17:47.063 "r_mbytes_per_sec": 0, 00:17:47.063 "w_mbytes_per_sec": 0 00:17:47.063 }, 00:17:47.063 "claimed": true, 00:17:47.063 "claim_type": "exclusive_write", 00:17:47.063 "zoned": false, 00:17:47.063 "supported_io_types": { 00:17:47.063 "read": true, 00:17:47.063 "write": true, 00:17:47.063 "unmap": true, 00:17:47.063 "write_zeroes": true, 00:17:47.063 "flush": true, 00:17:47.063 "reset": true, 00:17:47.063 "compare": false, 00:17:47.063 "compare_and_write": false, 00:17:47.063 "abort": true, 00:17:47.063 "nvme_admin": false, 00:17:47.063 "nvme_io": false 00:17:47.063 }, 00:17:47.063 "memory_domains": [ 00:17:47.063 { 00:17:47.063 "dma_device_id": "system", 00:17:47.063 "dma_device_type": 1 00:17:47.063 }, 00:17:47.063 { 00:17:47.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.063 "dma_device_type": 2 00:17:47.063 } 00:17:47.063 ], 00:17:47.063 "driver_specific": {} 00:17:47.063 }' 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.063 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:47.320 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:47.578 [2024-05-14 23:59:47.961832] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:47.578 [2024-05-14 23:59:47.961863] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:47.578 [2024-05-14 23:59:47.961917] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # 
has_redundancy concat 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.578 23:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.836 23:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:47.836 "name": "Existed_Raid", 00:17:47.836 "uuid": "8862704c-9797-4c11-8938-e91ae4f5acea", 00:17:47.836 "strip_size_kb": 64, 00:17:47.836 "state": "offline", 00:17:47.836 "raid_level": "concat", 00:17:47.836 "superblock": true, 00:17:47.836 "num_base_bdevs": 4, 00:17:47.836 "num_base_bdevs_discovered": 3, 00:17:47.836 "num_base_bdevs_operational": 3, 00:17:47.836 "base_bdevs_list": [ 00:17:47.836 { 00:17:47.836 "name": null, 00:17:47.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.836 "is_configured": false, 00:17:47.836 "data_offset": 2048, 00:17:47.836 "data_size": 63488 00:17:47.836 }, 00:17:47.836 { 00:17:47.836 "name": "BaseBdev2", 00:17:47.837 "uuid": "45d7d3ef-c705-44da-ab59-8fa94b2503fb", 00:17:47.837 "is_configured": true, 00:17:47.837 "data_offset": 2048, 00:17:47.837 "data_size": 63488 00:17:47.837 }, 00:17:47.837 { 00:17:47.837 "name": "BaseBdev3", 00:17:47.837 "uuid": "d9082f42-a20d-4dc3-ae3b-4cced6ef286a", 00:17:47.837 "is_configured": true, 00:17:47.837 "data_offset": 2048, 00:17:47.837 "data_size": 63488 00:17:47.837 }, 00:17:47.837 { 00:17:47.837 "name": "BaseBdev4", 00:17:47.837 "uuid": "92b70663-4a60-4f2d-a929-3cb9d99099ee", 00:17:47.837 "is_configured": true, 00:17:47.837 "data_offset": 2048, 00:17:47.837 "data_size": 63488 00:17:47.837 } 00:17:47.837 ] 00:17:47.837 }' 00:17:47.837 23:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:47.837 23:59:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:48.403 23:59:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:17:48.403 23:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:48.403 23:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.403 23:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:48.662 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:48.662 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:48.662 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:48.920 [2024-05-14 23:59:49.318533] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:48.920 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:48.920 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:48.920 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.920 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:49.179 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:49.179 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:49.179 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:49.179 [2024-05-14 23:59:49.754073] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:49.438 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:49.438 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:49.438 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.438 23:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:49.696 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:49.696 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:49.696 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:49.696 [2024-05-14 23:59:50.254372] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:49.696 [2024-05-14 23:59:50.254426] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1775470 name Existed_Raid, state offline 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:49.955 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:50.214 BaseBdev2 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:50.214 23:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:50.473 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:50.732 [ 00:17:50.732 { 00:17:50.732 "name": "BaseBdev2", 00:17:50.732 "aliases": [ 00:17:50.732 "da462556-2429-4aef-b1fc-38c1946a3514" 00:17:50.732 ], 00:17:50.732 "product_name": "Malloc disk", 00:17:50.732 "block_size": 512, 00:17:50.732 "num_blocks": 65536, 00:17:50.732 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:50.732 "assigned_rate_limits": { 00:17:50.732 "rw_ios_per_sec": 0, 00:17:50.732 "rw_mbytes_per_sec": 0, 00:17:50.732 "r_mbytes_per_sec": 0, 00:17:50.732 "w_mbytes_per_sec": 0 00:17:50.732 }, 00:17:50.732 "claimed": false, 00:17:50.732 "zoned": false, 00:17:50.732 "supported_io_types": { 00:17:50.732 "read": true, 00:17:50.732 "write": true, 00:17:50.732 "unmap": true, 00:17:50.732 "write_zeroes": true, 00:17:50.732 "flush": true, 00:17:50.732 "reset": true, 00:17:50.732 "compare": false, 00:17:50.732 "compare_and_write": false, 00:17:50.732 "abort": true, 00:17:50.732 "nvme_admin": false, 00:17:50.732 "nvme_io": false 00:17:50.732 }, 00:17:50.732 "memory_domains": [ 00:17:50.732 { 00:17:50.732 "dma_device_id": "system", 00:17:50.732 "dma_device_type": 1 00:17:50.732 }, 00:17:50.732 { 00:17:50.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.732 "dma_device_type": 2 00:17:50.732 } 00:17:50.732 ], 00:17:50.732 "driver_specific": 
{} 00:17:50.732 } 00:17:50.732 ] 00:17:50.732 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:50.732 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:50.732 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:50.732 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:50.991 BaseBdev3 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:50.991 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:51.250 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:51.509 [ 00:17:51.509 { 00:17:51.509 "name": "BaseBdev3", 00:17:51.509 "aliases": [ 00:17:51.509 "a8251f6c-32b0-48a3-91ae-820fd264ed23" 00:17:51.509 ], 00:17:51.509 "product_name": "Malloc disk", 00:17:51.509 "block_size": 512, 00:17:51.509 "num_blocks": 65536, 00:17:51.509 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:51.509 "assigned_rate_limits": { 00:17:51.509 "rw_ios_per_sec": 0, 00:17:51.509 "rw_mbytes_per_sec": 0, 00:17:51.509 "r_mbytes_per_sec": 0, 00:17:51.509 "w_mbytes_per_sec": 0 00:17:51.509 }, 00:17:51.509 "claimed": false, 00:17:51.509 "zoned": false, 00:17:51.509 "supported_io_types": { 00:17:51.509 "read": true, 00:17:51.509 "write": true, 00:17:51.509 "unmap": true, 00:17:51.509 "write_zeroes": true, 00:17:51.509 "flush": true, 00:17:51.509 "reset": true, 00:17:51.509 "compare": false, 00:17:51.509 "compare_and_write": false, 00:17:51.509 "abort": true, 00:17:51.509 "nvme_admin": false, 00:17:51.509 "nvme_io": false 00:17:51.509 }, 00:17:51.509 "memory_domains": [ 00:17:51.509 { 00:17:51.509 "dma_device_id": "system", 00:17:51.509 "dma_device_type": 1 00:17:51.509 }, 00:17:51.509 { 00:17:51.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.509 "dma_device_type": 2 00:17:51.509 } 00:17:51.509 ], 00:17:51.510 "driver_specific": {} 00:17:51.510 } 00:17:51.510 ] 00:17:51.510 23:59:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:51.510 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:51.510 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:51.510 23:59:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b BaseBdev4 00:17:51.768 BaseBdev4 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:51.768 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.026 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:52.285 [ 00:17:52.285 { 00:17:52.285 "name": "BaseBdev4", 00:17:52.285 "aliases": [ 00:17:52.285 "39861eae-0c78-42db-a417-1364be87aba9" 00:17:52.285 ], 00:17:52.285 "product_name": "Malloc disk", 00:17:52.285 "block_size": 512, 00:17:52.285 "num_blocks": 65536, 00:17:52.285 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:52.285 "assigned_rate_limits": { 00:17:52.285 "rw_ios_per_sec": 0, 00:17:52.285 "rw_mbytes_per_sec": 0, 00:17:52.285 "r_mbytes_per_sec": 0, 00:17:52.285 "w_mbytes_per_sec": 0 00:17:52.285 }, 00:17:52.285 "claimed": false, 00:17:52.285 "zoned": false, 00:17:52.285 "supported_io_types": { 00:17:52.285 "read": true, 00:17:52.285 "write": true, 00:17:52.285 "unmap": true, 00:17:52.285 "write_zeroes": true, 00:17:52.285 "flush": true, 00:17:52.285 "reset": true, 00:17:52.285 "compare": false, 00:17:52.285 "compare_and_write": false, 00:17:52.285 "abort": true, 00:17:52.285 "nvme_admin": false, 00:17:52.285 "nvme_io": false 00:17:52.285 }, 00:17:52.285 "memory_domains": [ 00:17:52.285 { 00:17:52.285 "dma_device_id": "system", 00:17:52.285 "dma_device_type": 1 00:17:52.285 }, 00:17:52.285 { 00:17:52.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.285 "dma_device_type": 2 00:17:52.285 } 00:17:52.285 ], 00:17:52.285 "driver_specific": {} 00:17:52.285 } 00:17:52.285 ] 00:17:52.285 23:59:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:52.285 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:52.285 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:52.285 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:52.544 [2024-05-14 23:59:52.931606] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:52.544 [2024-05-14 23:59:52.931649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:52.544 [2024-05-14 23:59:52.931670] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:52.544 [2024-05-14 23:59:52.933015] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:17:52.544 [2024-05-14 23:59:52.933057] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.544 23:59:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.803 23:59:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:52.803 "name": "Existed_Raid", 00:17:52.803 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:52.803 "strip_size_kb": 64, 00:17:52.803 "state": "configuring", 00:17:52.803 "raid_level": "concat", 00:17:52.803 "superblock": true, 00:17:52.803 "num_base_bdevs": 4, 00:17:52.803 "num_base_bdevs_discovered": 3, 00:17:52.803 "num_base_bdevs_operational": 4, 00:17:52.803 "base_bdevs_list": [ 00:17:52.803 { 00:17:52.803 "name": "BaseBdev1", 00:17:52.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.803 "is_configured": false, 00:17:52.803 "data_offset": 0, 00:17:52.803 "data_size": 0 00:17:52.803 }, 00:17:52.803 { 00:17:52.803 "name": "BaseBdev2", 00:17:52.803 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:52.803 "is_configured": true, 00:17:52.803 "data_offset": 2048, 00:17:52.803 "data_size": 63488 00:17:52.803 }, 00:17:52.803 { 00:17:52.803 "name": "BaseBdev3", 00:17:52.803 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:52.803 "is_configured": true, 00:17:52.803 "data_offset": 2048, 00:17:52.803 "data_size": 63488 00:17:52.803 }, 00:17:52.803 { 00:17:52.803 "name": "BaseBdev4", 00:17:52.803 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:52.803 "is_configured": true, 00:17:52.803 "data_offset": 2048, 00:17:52.803 "data_size": 63488 00:17:52.803 } 00:17:52.803 ] 00:17:52.803 }' 00:17:52.803 23:59:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:52.803 23:59:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.369 23:59:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:17:53.627 [2024-05-14 23:59:53.998591] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.627 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.886 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:53.886 "name": "Existed_Raid", 00:17:53.886 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:53.886 "strip_size_kb": 64, 00:17:53.886 "state": "configuring", 00:17:53.886 "raid_level": "concat", 00:17:53.886 "superblock": true, 00:17:53.886 "num_base_bdevs": 4, 00:17:53.886 "num_base_bdevs_discovered": 2, 00:17:53.886 "num_base_bdevs_operational": 4, 00:17:53.886 "base_bdevs_list": [ 00:17:53.886 { 00:17:53.886 "name": "BaseBdev1", 00:17:53.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.886 "is_configured": false, 00:17:53.886 "data_offset": 0, 00:17:53.886 "data_size": 0 00:17:53.886 }, 00:17:53.886 { 00:17:53.886 "name": null, 00:17:53.886 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:53.886 "is_configured": false, 00:17:53.886 "data_offset": 2048, 00:17:53.886 "data_size": 63488 00:17:53.886 }, 00:17:53.886 { 00:17:53.886 "name": "BaseBdev3", 00:17:53.886 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:53.886 "is_configured": true, 00:17:53.886 "data_offset": 2048, 00:17:53.886 "data_size": 63488 00:17:53.886 }, 00:17:53.886 { 00:17:53.886 "name": "BaseBdev4", 00:17:53.886 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:53.886 "is_configured": true, 00:17:53.886 "data_offset": 2048, 00:17:53.886 "data_size": 63488 00:17:53.886 } 00:17:53.886 ] 00:17:53.886 }' 00:17:53.886 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:53.886 23:59:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:54.453 23:59:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.453 23:59:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:54.711 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:17:54.711 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:54.969 [2024-05-14 23:59:55.345754] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.969 BaseBdev1 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:54.969 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.227 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:55.486 [ 00:17:55.486 { 00:17:55.486 "name": "BaseBdev1", 00:17:55.486 "aliases": [ 00:17:55.486 "e30593e2-cf25-43b8-a846-a7faa6b085b4" 00:17:55.486 ], 00:17:55.486 "product_name": "Malloc disk", 00:17:55.486 "block_size": 512, 00:17:55.486 "num_blocks": 65536, 00:17:55.486 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:17:55.486 "assigned_rate_limits": { 00:17:55.486 "rw_ios_per_sec": 0, 00:17:55.486 "rw_mbytes_per_sec": 0, 00:17:55.486 "r_mbytes_per_sec": 0, 00:17:55.486 "w_mbytes_per_sec": 0 00:17:55.486 }, 00:17:55.486 "claimed": true, 00:17:55.486 "claim_type": "exclusive_write", 00:17:55.486 "zoned": false, 00:17:55.486 "supported_io_types": { 00:17:55.486 "read": true, 00:17:55.486 "write": true, 00:17:55.486 "unmap": true, 00:17:55.486 "write_zeroes": true, 00:17:55.486 "flush": true, 00:17:55.486 "reset": true, 00:17:55.486 "compare": false, 00:17:55.486 "compare_and_write": false, 00:17:55.486 "abort": true, 00:17:55.486 "nvme_admin": false, 00:17:55.486 "nvme_io": false 00:17:55.486 }, 00:17:55.486 "memory_domains": [ 00:17:55.486 { 00:17:55.486 "dma_device_id": "system", 00:17:55.486 "dma_device_type": 1 00:17:55.486 }, 00:17:55.486 { 00:17:55.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:55.486 "dma_device_type": 2 00:17:55.486 } 00:17:55.486 ], 00:17:55.486 "driver_specific": {} 00:17:55.486 } 00:17:55.486 ] 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.486 23:59:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.745 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:55.745 "name": "Existed_Raid", 00:17:55.745 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:55.745 "strip_size_kb": 64, 00:17:55.745 "state": "configuring", 00:17:55.745 "raid_level": "concat", 00:17:55.745 "superblock": true, 00:17:55.745 "num_base_bdevs": 4, 00:17:55.745 "num_base_bdevs_discovered": 3, 00:17:55.745 "num_base_bdevs_operational": 4, 00:17:55.745 "base_bdevs_list": [ 00:17:55.745 { 00:17:55.745 "name": "BaseBdev1", 00:17:55.745 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:17:55.745 "is_configured": true, 00:17:55.745 "data_offset": 2048, 00:17:55.745 "data_size": 63488 00:17:55.746 }, 00:17:55.746 { 00:17:55.746 "name": null, 00:17:55.746 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:55.746 "is_configured": false, 00:17:55.746 "data_offset": 2048, 00:17:55.746 "data_size": 63488 00:17:55.746 }, 00:17:55.746 { 00:17:55.746 "name": "BaseBdev3", 00:17:55.746 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:55.746 "is_configured": true, 00:17:55.746 "data_offset": 2048, 00:17:55.746 "data_size": 63488 00:17:55.746 }, 00:17:55.746 { 00:17:55.746 "name": "BaseBdev4", 00:17:55.746 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:55.746 "is_configured": true, 00:17:55.746 "data_offset": 2048, 00:17:55.746 "data_size": 63488 00:17:55.746 } 00:17:55.746 ] 00:17:55.746 }' 00:17:55.746 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:55.746 23:59:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.313 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:56.313 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.571 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:17:56.571 23:59:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:56.571 [2024-05-14 23:59:57.146565] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.830 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:56.830 "name": "Existed_Raid", 00:17:56.830 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:56.830 "strip_size_kb": 64, 00:17:56.830 "state": "configuring", 00:17:56.830 "raid_level": "concat", 00:17:56.830 "superblock": true, 00:17:56.830 "num_base_bdevs": 4, 00:17:56.830 "num_base_bdevs_discovered": 2, 00:17:56.830 "num_base_bdevs_operational": 4, 00:17:56.830 "base_bdevs_list": [ 00:17:56.830 { 00:17:56.830 "name": "BaseBdev1", 00:17:56.830 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:17:56.830 "is_configured": true, 00:17:56.830 "data_offset": 2048, 00:17:56.830 "data_size": 63488 00:17:56.830 }, 00:17:56.830 { 00:17:56.830 "name": null, 00:17:56.830 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:56.830 "is_configured": false, 00:17:56.830 "data_offset": 2048, 00:17:56.830 "data_size": 63488 00:17:56.830 }, 00:17:56.830 { 00:17:56.830 "name": null, 00:17:56.831 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:56.831 "is_configured": false, 00:17:56.831 "data_offset": 2048, 00:17:56.831 "data_size": 63488 00:17:56.831 }, 00:17:56.831 { 00:17:56.831 "name": "BaseBdev4", 00:17:56.831 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:56.831 "is_configured": true, 00:17:56.831 "data_offset": 2048, 00:17:56.831 "data_size": 63488 00:17:56.831 } 00:17:56.831 ] 00:17:56.831 }' 00:17:56.831 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:56.831 23:59:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:57.769 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.769 23:59:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:17:57.769 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:17:57.769 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:58.064 [2024-05-14 23:59:58.397913] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.064 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.322 23:59:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:58.322 "name": "Existed_Raid", 00:17:58.322 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:58.322 "strip_size_kb": 64, 00:17:58.322 "state": "configuring", 00:17:58.322 "raid_level": "concat", 00:17:58.322 "superblock": true, 00:17:58.322 "num_base_bdevs": 4, 00:17:58.322 "num_base_bdevs_discovered": 3, 00:17:58.322 "num_base_bdevs_operational": 4, 00:17:58.322 "base_bdevs_list": [ 00:17:58.322 { 00:17:58.322 "name": "BaseBdev1", 00:17:58.323 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:17:58.323 "is_configured": true, 00:17:58.323 "data_offset": 2048, 00:17:58.323 "data_size": 63488 00:17:58.323 }, 00:17:58.323 { 00:17:58.323 "name": null, 00:17:58.323 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:58.323 "is_configured": false, 00:17:58.323 "data_offset": 2048, 00:17:58.323 "data_size": 63488 00:17:58.323 }, 00:17:58.323 { 00:17:58.323 "name": "BaseBdev3", 00:17:58.323 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:58.323 "is_configured": true, 00:17:58.323 "data_offset": 2048, 00:17:58.323 "data_size": 63488 00:17:58.323 }, 00:17:58.323 { 00:17:58.323 "name": "BaseBdev4", 00:17:58.323 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:58.323 "is_configured": true, 00:17:58.323 "data_offset": 2048, 00:17:58.323 "data_size": 63488 00:17:58.323 } 00:17:58.323 ] 00:17:58.323 }' 00:17:58.323 23:59:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:58.323 23:59:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:58.888 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.888 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:58.888 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:17:58.888 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:59.146 [2024-05-14 23:59:59.657270] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.146 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.404 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:59.404 "name": "Existed_Raid", 00:17:59.404 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:17:59.404 "strip_size_kb": 64, 00:17:59.404 "state": "configuring", 00:17:59.404 "raid_level": "concat", 00:17:59.404 "superblock": true, 00:17:59.404 "num_base_bdevs": 4, 00:17:59.404 "num_base_bdevs_discovered": 2, 00:17:59.404 "num_base_bdevs_operational": 4, 00:17:59.404 "base_bdevs_list": [ 00:17:59.404 { 00:17:59.404 "name": null, 00:17:59.404 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:17:59.404 "is_configured": false, 00:17:59.404 "data_offset": 2048, 00:17:59.404 "data_size": 63488 00:17:59.404 }, 00:17:59.404 { 00:17:59.404 "name": null, 00:17:59.404 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:17:59.404 "is_configured": false, 00:17:59.404 "data_offset": 2048, 00:17:59.404 "data_size": 63488 00:17:59.404 }, 00:17:59.404 { 00:17:59.404 "name": "BaseBdev3", 00:17:59.404 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:17:59.404 "is_configured": true, 00:17:59.404 
"data_offset": 2048, 00:17:59.404 "data_size": 63488 00:17:59.404 }, 00:17:59.404 { 00:17:59.404 "name": "BaseBdev4", 00:17:59.404 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:17:59.404 "is_configured": true, 00:17:59.404 "data_offset": 2048, 00:17:59.404 "data_size": 63488 00:17:59.404 } 00:17:59.404 ] 00:17:59.404 }' 00:17:59.404 23:59:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:59.404 23:59:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:59.970 00:00:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.970 00:00:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:00.229 00:00:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:18:00.229 00:00:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:00.487 [2024-05-15 00:00:00.989271] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:00.487 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.488 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.747 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:00.747 "name": "Existed_Raid", 00:18:00.747 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:18:00.747 "strip_size_kb": 64, 00:18:00.747 "state": "configuring", 00:18:00.747 "raid_level": "concat", 00:18:00.747 "superblock": true, 00:18:00.747 "num_base_bdevs": 4, 00:18:00.747 "num_base_bdevs_discovered": 3, 00:18:00.747 "num_base_bdevs_operational": 4, 00:18:00.747 "base_bdevs_list": [ 00:18:00.747 { 00:18:00.747 "name": null, 00:18:00.747 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:18:00.747 "is_configured": false, 00:18:00.747 "data_offset": 2048, 
00:18:00.747 "data_size": 63488 00:18:00.747 }, 00:18:00.747 { 00:18:00.747 "name": "BaseBdev2", 00:18:00.747 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:18:00.747 "is_configured": true, 00:18:00.747 "data_offset": 2048, 00:18:00.747 "data_size": 63488 00:18:00.747 }, 00:18:00.747 { 00:18:00.747 "name": "BaseBdev3", 00:18:00.747 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:18:00.747 "is_configured": true, 00:18:00.747 "data_offset": 2048, 00:18:00.747 "data_size": 63488 00:18:00.747 }, 00:18:00.747 { 00:18:00.747 "name": "BaseBdev4", 00:18:00.747 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:18:00.747 "is_configured": true, 00:18:00.747 "data_offset": 2048, 00:18:00.747 "data_size": 63488 00:18:00.747 } 00:18:00.747 ] 00:18:00.747 }' 00:18:00.747 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:00.747 00:00:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.314 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.314 00:00:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:01.572 00:00:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:18:01.572 00:00:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.572 00:00:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:01.829 00:00:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e30593e2-cf25-43b8-a846-a7faa6b085b4 00:18:02.085 [2024-05-15 00:00:02.494040] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:02.085 [2024-05-15 00:00:02.494210] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x191b370 00:18:02.085 [2024-05-15 00:00:02.494223] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:02.085 [2024-05-15 00:00:02.494423] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1690340 00:18:02.085 [2024-05-15 00:00:02.494549] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x191b370 00:18:02.085 [2024-05-15 00:00:02.494559] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x191b370 00:18:02.085 [2024-05-15 00:00:02.494664] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:02.085 NewBaseBdev 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:02.086 00:00:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:02.086 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.342 00:00:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:02.598 [ 00:18:02.598 { 00:18:02.598 "name": "NewBaseBdev", 00:18:02.598 "aliases": [ 00:18:02.598 "e30593e2-cf25-43b8-a846-a7faa6b085b4" 00:18:02.598 ], 00:18:02.598 "product_name": "Malloc disk", 00:18:02.598 "block_size": 512, 00:18:02.598 "num_blocks": 65536, 00:18:02.598 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:18:02.598 "assigned_rate_limits": { 00:18:02.598 "rw_ios_per_sec": 0, 00:18:02.598 "rw_mbytes_per_sec": 0, 00:18:02.598 "r_mbytes_per_sec": 0, 00:18:02.598 "w_mbytes_per_sec": 0 00:18:02.598 }, 00:18:02.598 "claimed": true, 00:18:02.598 "claim_type": "exclusive_write", 00:18:02.599 "zoned": false, 00:18:02.599 "supported_io_types": { 00:18:02.599 "read": true, 00:18:02.599 "write": true, 00:18:02.599 "unmap": true, 00:18:02.599 "write_zeroes": true, 00:18:02.599 "flush": true, 00:18:02.599 "reset": true, 00:18:02.599 "compare": false, 00:18:02.599 "compare_and_write": false, 00:18:02.599 "abort": true, 00:18:02.599 "nvme_admin": false, 00:18:02.599 "nvme_io": false 00:18:02.599 }, 00:18:02.599 "memory_domains": [ 00:18:02.599 { 00:18:02.599 "dma_device_id": "system", 00:18:02.599 "dma_device_type": 1 00:18:02.599 }, 00:18:02.599 { 00:18:02.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.599 "dma_device_type": 2 00:18:02.599 } 00:18:02.599 ], 00:18:02.599 "driver_specific": {} 00:18:02.599 } 00:18:02.599 ] 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.599 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:02.856 00:00:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:02.856 "name": "Existed_Raid", 00:18:02.856 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:18:02.856 "strip_size_kb": 64, 00:18:02.856 "state": "online", 00:18:02.856 "raid_level": "concat", 00:18:02.856 "superblock": true, 00:18:02.856 "num_base_bdevs": 4, 00:18:02.856 "num_base_bdevs_discovered": 4, 00:18:02.856 "num_base_bdevs_operational": 4, 00:18:02.856 "base_bdevs_list": [ 00:18:02.856 { 00:18:02.856 "name": "NewBaseBdev", 00:18:02.856 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:18:02.856 "is_configured": true, 00:18:02.856 "data_offset": 2048, 00:18:02.856 "data_size": 63488 00:18:02.856 }, 00:18:02.856 { 00:18:02.856 "name": "BaseBdev2", 00:18:02.856 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:18:02.856 "is_configured": true, 00:18:02.856 "data_offset": 2048, 00:18:02.856 "data_size": 63488 00:18:02.856 }, 00:18:02.856 { 00:18:02.856 "name": "BaseBdev3", 00:18:02.856 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:18:02.856 "is_configured": true, 00:18:02.856 "data_offset": 2048, 00:18:02.856 "data_size": 63488 00:18:02.856 }, 00:18:02.856 { 00:18:02.856 "name": "BaseBdev4", 00:18:02.857 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:18:02.857 "is_configured": true, 00:18:02.857 "data_offset": 2048, 00:18:02.857 "data_size": 63488 00:18:02.857 } 00:18:02.857 ] 00:18:02.857 }' 00:18:02.857 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:02.857 00:00:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:03.423 00:00:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:03.681 [2024-05-15 00:00:04.046474] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:03.681 "name": "Existed_Raid", 00:18:03.681 "aliases": [ 00:18:03.681 "ef29b586-fe9e-411c-96ef-053a73892fb1" 00:18:03.681 ], 00:18:03.681 "product_name": "Raid Volume", 00:18:03.681 "block_size": 512, 00:18:03.681 "num_blocks": 253952, 00:18:03.681 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:18:03.681 "assigned_rate_limits": { 00:18:03.681 "rw_ios_per_sec": 0, 00:18:03.681 "rw_mbytes_per_sec": 0, 00:18:03.681 "r_mbytes_per_sec": 0, 00:18:03.681 "w_mbytes_per_sec": 0 00:18:03.681 }, 00:18:03.681 "claimed": false, 00:18:03.681 "zoned": false, 00:18:03.681 "supported_io_types": { 00:18:03.681 "read": true, 00:18:03.681 "write": true, 00:18:03.681 "unmap": true, 00:18:03.681 
"write_zeroes": true, 00:18:03.681 "flush": true, 00:18:03.681 "reset": true, 00:18:03.681 "compare": false, 00:18:03.681 "compare_and_write": false, 00:18:03.681 "abort": false, 00:18:03.681 "nvme_admin": false, 00:18:03.681 "nvme_io": false 00:18:03.681 }, 00:18:03.681 "memory_domains": [ 00:18:03.681 { 00:18:03.681 "dma_device_id": "system", 00:18:03.681 "dma_device_type": 1 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.681 "dma_device_type": 2 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "system", 00:18:03.681 "dma_device_type": 1 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.681 "dma_device_type": 2 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "system", 00:18:03.681 "dma_device_type": 1 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.681 "dma_device_type": 2 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "system", 00:18:03.681 "dma_device_type": 1 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.681 "dma_device_type": 2 00:18:03.681 } 00:18:03.681 ], 00:18:03.681 "driver_specific": { 00:18:03.681 "raid": { 00:18:03.681 "uuid": "ef29b586-fe9e-411c-96ef-053a73892fb1", 00:18:03.681 "strip_size_kb": 64, 00:18:03.681 "state": "online", 00:18:03.681 "raid_level": "concat", 00:18:03.681 "superblock": true, 00:18:03.681 "num_base_bdevs": 4, 00:18:03.681 "num_base_bdevs_discovered": 4, 00:18:03.681 "num_base_bdevs_operational": 4, 00:18:03.681 "base_bdevs_list": [ 00:18:03.681 { 00:18:03.681 "name": "NewBaseBdev", 00:18:03.681 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:18:03.681 "is_configured": true, 00:18:03.681 "data_offset": 2048, 00:18:03.681 "data_size": 63488 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "name": "BaseBdev2", 00:18:03.681 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:18:03.681 "is_configured": true, 00:18:03.681 "data_offset": 2048, 00:18:03.681 "data_size": 63488 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "name": "BaseBdev3", 00:18:03.681 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:18:03.681 "is_configured": true, 00:18:03.681 "data_offset": 2048, 00:18:03.681 "data_size": 63488 00:18:03.681 }, 00:18:03.681 { 00:18:03.681 "name": "BaseBdev4", 00:18:03.681 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:18:03.681 "is_configured": true, 00:18:03.681 "data_offset": 2048, 00:18:03.681 "data_size": 63488 00:18:03.681 } 00:18:03.681 ] 00:18:03.681 } 00:18:03.681 } 00:18:03.681 }' 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:18:03.681 BaseBdev2 00:18:03.681 BaseBdev3 00:18:03.681 BaseBdev4' 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:03.681 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:03.940 "name": "NewBaseBdev", 00:18:03.940 "aliases": 
[ 00:18:03.940 "e30593e2-cf25-43b8-a846-a7faa6b085b4" 00:18:03.940 ], 00:18:03.940 "product_name": "Malloc disk", 00:18:03.940 "block_size": 512, 00:18:03.940 "num_blocks": 65536, 00:18:03.940 "uuid": "e30593e2-cf25-43b8-a846-a7faa6b085b4", 00:18:03.940 "assigned_rate_limits": { 00:18:03.940 "rw_ios_per_sec": 0, 00:18:03.940 "rw_mbytes_per_sec": 0, 00:18:03.940 "r_mbytes_per_sec": 0, 00:18:03.940 "w_mbytes_per_sec": 0 00:18:03.940 }, 00:18:03.940 "claimed": true, 00:18:03.940 "claim_type": "exclusive_write", 00:18:03.940 "zoned": false, 00:18:03.940 "supported_io_types": { 00:18:03.940 "read": true, 00:18:03.940 "write": true, 00:18:03.940 "unmap": true, 00:18:03.940 "write_zeroes": true, 00:18:03.940 "flush": true, 00:18:03.940 "reset": true, 00:18:03.940 "compare": false, 00:18:03.940 "compare_and_write": false, 00:18:03.940 "abort": true, 00:18:03.940 "nvme_admin": false, 00:18:03.940 "nvme_io": false 00:18:03.940 }, 00:18:03.940 "memory_domains": [ 00:18:03.940 { 00:18:03.940 "dma_device_id": "system", 00:18:03.940 "dma_device_type": 1 00:18:03.940 }, 00:18:03.940 { 00:18:03.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.940 "dma_device_type": 2 00:18:03.940 } 00:18:03.940 ], 00:18:03.940 "driver_specific": {} 00:18:03.940 }' 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:03.940 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:04.198 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:04.457 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:04.457 "name": "BaseBdev2", 00:18:04.457 "aliases": [ 00:18:04.457 "da462556-2429-4aef-b1fc-38c1946a3514" 00:18:04.457 ], 00:18:04.457 "product_name": "Malloc disk", 00:18:04.457 "block_size": 512, 00:18:04.457 "num_blocks": 65536, 00:18:04.457 "uuid": "da462556-2429-4aef-b1fc-38c1946a3514", 00:18:04.457 "assigned_rate_limits": { 00:18:04.457 "rw_ios_per_sec": 0, 00:18:04.457 "rw_mbytes_per_sec": 0, 00:18:04.457 "r_mbytes_per_sec": 0, 00:18:04.457 "w_mbytes_per_sec": 0 00:18:04.457 
}, 00:18:04.457 "claimed": true, 00:18:04.457 "claim_type": "exclusive_write", 00:18:04.457 "zoned": false, 00:18:04.457 "supported_io_types": { 00:18:04.457 "read": true, 00:18:04.457 "write": true, 00:18:04.457 "unmap": true, 00:18:04.457 "write_zeroes": true, 00:18:04.457 "flush": true, 00:18:04.457 "reset": true, 00:18:04.457 "compare": false, 00:18:04.457 "compare_and_write": false, 00:18:04.457 "abort": true, 00:18:04.457 "nvme_admin": false, 00:18:04.457 "nvme_io": false 00:18:04.457 }, 00:18:04.457 "memory_domains": [ 00:18:04.457 { 00:18:04.457 "dma_device_id": "system", 00:18:04.457 "dma_device_type": 1 00:18:04.457 }, 00:18:04.457 { 00:18:04.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.457 "dma_device_type": 2 00:18:04.457 } 00:18:04.457 ], 00:18:04.457 "driver_specific": {} 00:18:04.457 }' 00:18:04.457 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:04.457 00:00:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:04.457 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:04.457 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:04.715 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:04.973 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:04.973 "name": "BaseBdev3", 00:18:04.973 "aliases": [ 00:18:04.973 "a8251f6c-32b0-48a3-91ae-820fd264ed23" 00:18:04.973 ], 00:18:04.973 "product_name": "Malloc disk", 00:18:04.973 "block_size": 512, 00:18:04.973 "num_blocks": 65536, 00:18:04.974 "uuid": "a8251f6c-32b0-48a3-91ae-820fd264ed23", 00:18:04.974 "assigned_rate_limits": { 00:18:04.974 "rw_ios_per_sec": 0, 00:18:04.974 "rw_mbytes_per_sec": 0, 00:18:04.974 "r_mbytes_per_sec": 0, 00:18:04.974 "w_mbytes_per_sec": 0 00:18:04.974 }, 00:18:04.974 "claimed": true, 00:18:04.974 "claim_type": "exclusive_write", 00:18:04.974 "zoned": false, 00:18:04.974 "supported_io_types": { 00:18:04.974 "read": true, 00:18:04.974 "write": true, 00:18:04.974 "unmap": true, 00:18:04.974 "write_zeroes": true, 00:18:04.974 "flush": true, 00:18:04.974 "reset": true, 00:18:04.974 "compare": false, 00:18:04.974 "compare_and_write": false, 00:18:04.974 "abort": true, 00:18:04.974 
"nvme_admin": false, 00:18:04.974 "nvme_io": false 00:18:04.974 }, 00:18:04.974 "memory_domains": [ 00:18:04.974 { 00:18:04.974 "dma_device_id": "system", 00:18:04.974 "dma_device_type": 1 00:18:04.974 }, 00:18:04.974 { 00:18:04.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.974 "dma_device_type": 2 00:18:04.974 } 00:18:04.974 ], 00:18:04.974 "driver_specific": {} 00:18:04.974 }' 00:18:04.974 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.232 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:05.491 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:05.491 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:05.491 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:05.491 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:05.491 00:00:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:05.749 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:05.750 "name": "BaseBdev4", 00:18:05.750 "aliases": [ 00:18:05.750 "39861eae-0c78-42db-a417-1364be87aba9" 00:18:05.750 ], 00:18:05.750 "product_name": "Malloc disk", 00:18:05.750 "block_size": 512, 00:18:05.750 "num_blocks": 65536, 00:18:05.750 "uuid": "39861eae-0c78-42db-a417-1364be87aba9", 00:18:05.750 "assigned_rate_limits": { 00:18:05.750 "rw_ios_per_sec": 0, 00:18:05.750 "rw_mbytes_per_sec": 0, 00:18:05.750 "r_mbytes_per_sec": 0, 00:18:05.750 "w_mbytes_per_sec": 0 00:18:05.750 }, 00:18:05.750 "claimed": true, 00:18:05.750 "claim_type": "exclusive_write", 00:18:05.750 "zoned": false, 00:18:05.750 "supported_io_types": { 00:18:05.750 "read": true, 00:18:05.750 "write": true, 00:18:05.750 "unmap": true, 00:18:05.750 "write_zeroes": true, 00:18:05.750 "flush": true, 00:18:05.750 "reset": true, 00:18:05.750 "compare": false, 00:18:05.750 "compare_and_write": false, 00:18:05.750 "abort": true, 00:18:05.750 "nvme_admin": false, 00:18:05.750 "nvme_io": false 00:18:05.750 }, 00:18:05.750 "memory_domains": [ 00:18:05.750 { 00:18:05.750 "dma_device_id": "system", 00:18:05.750 "dma_device_type": 1 00:18:05.750 }, 00:18:05.750 { 00:18:05.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.750 "dma_device_type": 2 00:18:05.750 } 00:18:05.750 ], 00:18:05.750 "driver_specific": {} 00:18:05.750 }' 00:18:05.750 00:00:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:05.750 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:06.008 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.008 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:06.008 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:06.008 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:06.008 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:06.265 [2024-05-15 00:00:06.685196] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:06.265 [2024-05-15 00:00:06.685226] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:06.265 [2024-05-15 00:00:06.685288] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:06.265 [2024-05-15 00:00:06.685351] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:06.266 [2024-05-15 00:00:06.685363] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191b370 name Existed_Raid, state offline 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 451113 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 451113 ']' 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 451113 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 451113 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 451113' 00:18:06.266 killing process with pid 451113 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 451113 00:18:06.266 [2024-05-15 00:00:06.749587] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:06.266 00:00:06 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@970 -- # wait 451113 00:18:06.266 [2024-05-15 00:00:06.791578] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:06.523 00:00:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:18:06.523 00:18:06.523 real 0m31.971s 00:18:06.523 user 0m58.558s 00:18:06.523 sys 0m5.808s 00:18:06.523 00:00:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:06.523 00:00:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.523 ************************************ 00:18:06.523 END TEST raid_state_function_test_sb 00:18:06.523 ************************************ 00:18:06.523 00:00:07 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:18:06.523 00:00:07 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:18:06.523 00:00:07 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:06.523 00:00:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:06.781 ************************************ 00:18:06.781 START TEST raid_superblock_test 00:18:06.781 ************************************ 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 4 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=456468 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 456468 /var/tmp/spdk-raid.sock 00:18:06.781 00:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' 
-z 456468 ']' 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:06.782 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:06.782 00:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.782 [2024-05-15 00:00:07.201084] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:18:06.782 [2024-05-15 00:00:07.201155] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid456468 ] 00:18:06.782 [2024-05-15 00:00:07.330968] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:07.040 [2024-05-15 00:00:07.437061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:07.040 [2024-05-15 00:00:07.504023] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.040 [2024-05-15 00:00:07.504063] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:07.606 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:07.864 malloc1 00:18:07.864 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:08.129 [2024-05-15 00:00:08.604594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:08.129 [2024-05-15 00:00:08.604645] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.129 [2024-05-15 00:00:08.604667] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa6780 00:18:08.129 
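The surrounding trace is raid_superblock_test from bdev_raid.sh building its base devices: for each index it creates a 32 MiB malloc bdev with 512-byte blocks, wraps it in a passthru bdev carrying a fixed per-index UUID, and then assembles the four passthru bdevs into a concat raid bdev with an on-disk superblock (the later bdev_raid_get_bdevs output reports "superblock": true). A minimal stand-alone sketch of that flow against the RPC socket used in this run follows; the rpc array variable and the explicit loop are illustrative stand-ins, not the script's own helpers:

    #!/usr/bin/env bash
    # Talk to the bdev_svc app over the raid test socket (path taken from this run).
    rpc=(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock)
    num_base_bdevs=4
    for ((i = 1; i <= num_base_bdevs; i++)); do
        # 32 MiB backing malloc bdev with 512-byte blocks (65536 blocks total).
        "${rpc[@]}" bdev_malloc_create 32 512 -b "malloc$i"
        # Passthru wrapper with a deterministic UUID per index, matching the
        # 00000000-0000-0000-0000-00000000000N pattern seen in the trace.
        "${rpc[@]}" bdev_passthru_create -b "malloc$i" -p "pt$i" \
            -u "00000000-0000-0000-0000-00000000000$i"
    done
    # Assemble a concat raid bdev with a 64 KiB strip size and a superblock (-s).
    "${rpc[@]}" bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

The same commands appear verbatim in the trace below for pt2 through pt4 and for the final bdev_raid_create call; only the loop structure here is a simplification of the test's internal bookkeeping.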
[2024-05-15 00:00:08.604680] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.129 [2024-05-15 00:00:08.606443] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.129 [2024-05-15 00:00:08.606475] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:08.129 pt1 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:08.129 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:08.400 malloc2 00:18:08.400 00:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:08.662 [2024-05-15 00:00:09.099972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:08.662 [2024-05-15 00:00:09.100019] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.662 [2024-05-15 00:00:09.100039] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa7b60 00:18:08.662 [2024-05-15 00:00:09.100052] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.662 [2024-05-15 00:00:09.101616] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.662 [2024-05-15 00:00:09.101647] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:08.662 pt2 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:08.662 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc3 00:18:08.929 malloc3 00:18:08.929 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:09.188 [2024-05-15 00:00:09.573824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:09.188 [2024-05-15 00:00:09.573873] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.188 [2024-05-15 00:00:09.573894] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c52080 00:18:09.188 [2024-05-15 00:00:09.573907] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.188 [2024-05-15 00:00:09.575483] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.188 [2024-05-15 00:00:09.575515] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:09.188 pt3 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:09.188 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:09.451 malloc4 00:18:09.451 00:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:09.711 [2024-05-15 00:00:10.064952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:09.711 [2024-05-15 00:00:10.065012] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.711 [2024-05-15 00:00:10.065037] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c54610 00:18:09.711 [2024-05-15 00:00:10.065049] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.711 [2024-05-15 00:00:10.066672] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.711 [2024-05-15 00:00:10.066702] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:09.711 pt4 00:18:09.711 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:09.711 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:09.711 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create 
-z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:09.969 [2024-05-15 00:00:10.309624] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:09.969 [2024-05-15 00:00:10.310987] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:09.969 [2024-05-15 00:00:10.311042] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:09.969 [2024-05-15 00:00:10.311086] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:09.969 [2024-05-15 00:00:10.311264] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c56480 00:18:09.969 [2024-05-15 00:00:10.311276] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:09.969 [2024-05-15 00:00:10.311496] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1abd670 00:18:09.969 [2024-05-15 00:00:10.311648] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c56480 00:18:09.969 [2024-05-15 00:00:10.311657] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c56480 00:18:09.969 [2024-05-15 00:00:10.311765] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.969 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:10.241 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:10.241 "name": "raid_bdev1", 00:18:10.241 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:10.241 "strip_size_kb": 64, 00:18:10.241 "state": "online", 00:18:10.241 "raid_level": "concat", 00:18:10.241 "superblock": true, 00:18:10.241 "num_base_bdevs": 4, 00:18:10.241 "num_base_bdevs_discovered": 4, 00:18:10.241 "num_base_bdevs_operational": 4, 00:18:10.241 "base_bdevs_list": [ 00:18:10.241 { 00:18:10.241 "name": "pt1", 00:18:10.241 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:10.241 "is_configured": true, 00:18:10.241 "data_offset": 2048, 00:18:10.241 "data_size": 63488 00:18:10.241 }, 00:18:10.241 { 00:18:10.241 "name": "pt2", 00:18:10.241 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:10.241 
"is_configured": true, 00:18:10.241 "data_offset": 2048, 00:18:10.241 "data_size": 63488 00:18:10.241 }, 00:18:10.241 { 00:18:10.241 "name": "pt3", 00:18:10.241 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:10.241 "is_configured": true, 00:18:10.241 "data_offset": 2048, 00:18:10.241 "data_size": 63488 00:18:10.241 }, 00:18:10.241 { 00:18:10.241 "name": "pt4", 00:18:10.241 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:10.241 "is_configured": true, 00:18:10.241 "data_offset": 2048, 00:18:10.241 "data_size": 63488 00:18:10.241 } 00:18:10.241 ] 00:18:10.241 }' 00:18:10.241 00:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:10.241 00:00:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:10.822 [2024-05-15 00:00:11.364658] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:10.822 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:10.822 "name": "raid_bdev1", 00:18:10.822 "aliases": [ 00:18:10.822 "586689d2-d9fd-434a-aaaa-572430550058" 00:18:10.822 ], 00:18:10.822 "product_name": "Raid Volume", 00:18:10.822 "block_size": 512, 00:18:10.822 "num_blocks": 253952, 00:18:10.822 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:10.823 "assigned_rate_limits": { 00:18:10.823 "rw_ios_per_sec": 0, 00:18:10.823 "rw_mbytes_per_sec": 0, 00:18:10.823 "r_mbytes_per_sec": 0, 00:18:10.823 "w_mbytes_per_sec": 0 00:18:10.823 }, 00:18:10.823 "claimed": false, 00:18:10.823 "zoned": false, 00:18:10.823 "supported_io_types": { 00:18:10.823 "read": true, 00:18:10.823 "write": true, 00:18:10.823 "unmap": true, 00:18:10.823 "write_zeroes": true, 00:18:10.823 "flush": true, 00:18:10.823 "reset": true, 00:18:10.823 "compare": false, 00:18:10.823 "compare_and_write": false, 00:18:10.823 "abort": false, 00:18:10.823 "nvme_admin": false, 00:18:10.823 "nvme_io": false 00:18:10.823 }, 00:18:10.823 "memory_domains": [ 00:18:10.823 { 00:18:10.823 "dma_device_id": "system", 00:18:10.823 "dma_device_type": 1 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.823 "dma_device_type": 2 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "system", 00:18:10.823 "dma_device_type": 1 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.823 "dma_device_type": 2 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "system", 00:18:10.823 "dma_device_type": 1 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.823 
"dma_device_type": 2 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "system", 00:18:10.823 "dma_device_type": 1 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.823 "dma_device_type": 2 00:18:10.823 } 00:18:10.823 ], 00:18:10.823 "driver_specific": { 00:18:10.823 "raid": { 00:18:10.823 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:10.823 "strip_size_kb": 64, 00:18:10.823 "state": "online", 00:18:10.823 "raid_level": "concat", 00:18:10.823 "superblock": true, 00:18:10.823 "num_base_bdevs": 4, 00:18:10.823 "num_base_bdevs_discovered": 4, 00:18:10.823 "num_base_bdevs_operational": 4, 00:18:10.823 "base_bdevs_list": [ 00:18:10.823 { 00:18:10.823 "name": "pt1", 00:18:10.823 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:10.823 "is_configured": true, 00:18:10.823 "data_offset": 2048, 00:18:10.823 "data_size": 63488 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "name": "pt2", 00:18:10.823 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:10.823 "is_configured": true, 00:18:10.823 "data_offset": 2048, 00:18:10.823 "data_size": 63488 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "name": "pt3", 00:18:10.823 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:10.823 "is_configured": true, 00:18:10.823 "data_offset": 2048, 00:18:10.823 "data_size": 63488 00:18:10.823 }, 00:18:10.823 { 00:18:10.823 "name": "pt4", 00:18:10.823 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:10.823 "is_configured": true, 00:18:10.823 "data_offset": 2048, 00:18:10.823 "data_size": 63488 00:18:10.823 } 00:18:10.823 ] 00:18:10.823 } 00:18:10.823 } 00:18:10.823 }' 00:18:10.823 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:11.083 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:18:11.083 pt2 00:18:11.083 pt3 00:18:11.083 pt4' 00:18:11.083 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:11.083 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:11.083 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:11.349 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:11.349 "name": "pt1", 00:18:11.349 "aliases": [ 00:18:11.349 "b84e654d-bf12-598d-a010-c33a398514e6" 00:18:11.349 ], 00:18:11.349 "product_name": "passthru", 00:18:11.349 "block_size": 512, 00:18:11.349 "num_blocks": 65536, 00:18:11.349 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:11.349 "assigned_rate_limits": { 00:18:11.349 "rw_ios_per_sec": 0, 00:18:11.349 "rw_mbytes_per_sec": 0, 00:18:11.349 "r_mbytes_per_sec": 0, 00:18:11.349 "w_mbytes_per_sec": 0 00:18:11.349 }, 00:18:11.349 "claimed": true, 00:18:11.349 "claim_type": "exclusive_write", 00:18:11.349 "zoned": false, 00:18:11.349 "supported_io_types": { 00:18:11.349 "read": true, 00:18:11.349 "write": true, 00:18:11.349 "unmap": true, 00:18:11.349 "write_zeroes": true, 00:18:11.349 "flush": true, 00:18:11.350 "reset": true, 00:18:11.350 "compare": false, 00:18:11.350 "compare_and_write": false, 00:18:11.350 "abort": true, 00:18:11.350 "nvme_admin": false, 00:18:11.350 "nvme_io": false 00:18:11.350 }, 00:18:11.350 "memory_domains": [ 00:18:11.350 { 00:18:11.350 "dma_device_id": "system", 00:18:11.350 
"dma_device_type": 1 00:18:11.350 }, 00:18:11.350 { 00:18:11.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.350 "dma_device_type": 2 00:18:11.350 } 00:18:11.350 ], 00:18:11.350 "driver_specific": { 00:18:11.350 "passthru": { 00:18:11.350 "name": "pt1", 00:18:11.350 "base_bdev_name": "malloc1" 00:18:11.350 } 00:18:11.350 } 00:18:11.350 }' 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:11.350 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:11.613 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.613 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:11.613 00:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:11.613 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:11.613 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:11.613 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:11.613 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:11.912 "name": "pt2", 00:18:11.912 "aliases": [ 00:18:11.912 "4c3f38d2-c13c-5452-83a3-4db073e419f5" 00:18:11.912 ], 00:18:11.912 "product_name": "passthru", 00:18:11.912 "block_size": 512, 00:18:11.912 "num_blocks": 65536, 00:18:11.912 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:11.912 "assigned_rate_limits": { 00:18:11.912 "rw_ios_per_sec": 0, 00:18:11.912 "rw_mbytes_per_sec": 0, 00:18:11.912 "r_mbytes_per_sec": 0, 00:18:11.912 "w_mbytes_per_sec": 0 00:18:11.912 }, 00:18:11.912 "claimed": true, 00:18:11.912 "claim_type": "exclusive_write", 00:18:11.912 "zoned": false, 00:18:11.912 "supported_io_types": { 00:18:11.912 "read": true, 00:18:11.912 "write": true, 00:18:11.912 "unmap": true, 00:18:11.912 "write_zeroes": true, 00:18:11.912 "flush": true, 00:18:11.912 "reset": true, 00:18:11.912 "compare": false, 00:18:11.912 "compare_and_write": false, 00:18:11.912 "abort": true, 00:18:11.912 "nvme_admin": false, 00:18:11.912 "nvme_io": false 00:18:11.912 }, 00:18:11.912 "memory_domains": [ 00:18:11.912 { 00:18:11.912 "dma_device_id": "system", 00:18:11.912 "dma_device_type": 1 00:18:11.912 }, 00:18:11.912 { 00:18:11.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.912 "dma_device_type": 2 00:18:11.912 } 00:18:11.912 ], 00:18:11.912 "driver_specific": { 00:18:11.912 "passthru": { 00:18:11.912 "name": "pt2", 00:18:11.912 "base_bdev_name": "malloc2" 00:18:11.912 } 00:18:11.912 } 00:18:11.912 }' 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.912 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:12.173 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:12.439 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:12.439 "name": "pt3", 00:18:12.439 "aliases": [ 00:18:12.439 "efcc5234-19a6-5e12-a19e-6a27229f7d84" 00:18:12.439 ], 00:18:12.439 "product_name": "passthru", 00:18:12.439 "block_size": 512, 00:18:12.439 "num_blocks": 65536, 00:18:12.439 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:12.439 "assigned_rate_limits": { 00:18:12.439 "rw_ios_per_sec": 0, 00:18:12.439 "rw_mbytes_per_sec": 0, 00:18:12.439 "r_mbytes_per_sec": 0, 00:18:12.439 "w_mbytes_per_sec": 0 00:18:12.439 }, 00:18:12.439 "claimed": true, 00:18:12.439 "claim_type": "exclusive_write", 00:18:12.439 "zoned": false, 00:18:12.439 "supported_io_types": { 00:18:12.439 "read": true, 00:18:12.439 "write": true, 00:18:12.439 "unmap": true, 00:18:12.439 "write_zeroes": true, 00:18:12.439 "flush": true, 00:18:12.439 "reset": true, 00:18:12.439 "compare": false, 00:18:12.440 "compare_and_write": false, 00:18:12.440 "abort": true, 00:18:12.440 "nvme_admin": false, 00:18:12.440 "nvme_io": false 00:18:12.440 }, 00:18:12.440 "memory_domains": [ 00:18:12.440 { 00:18:12.440 "dma_device_id": "system", 00:18:12.440 "dma_device_type": 1 00:18:12.440 }, 00:18:12.440 { 00:18:12.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.440 "dma_device_type": 2 00:18:12.440 } 00:18:12.440 ], 00:18:12.440 "driver_specific": { 00:18:12.440 "passthru": { 00:18:12.440 "name": "pt3", 00:18:12.440 "base_bdev_name": "malloc3" 00:18:12.440 } 00:18:12.440 } 00:18:12.440 }' 00:18:12.440 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:12.440 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:12.440 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:12.440 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:12.440 00:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:12.704 00:00:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:12.704 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:12.968 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:12.968 "name": "pt4", 00:18:12.968 "aliases": [ 00:18:12.968 "898bdea4-90fe-58ad-a719-6bda49ac0b69" 00:18:12.968 ], 00:18:12.968 "product_name": "passthru", 00:18:12.968 "block_size": 512, 00:18:12.968 "num_blocks": 65536, 00:18:12.968 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:12.968 "assigned_rate_limits": { 00:18:12.968 "rw_ios_per_sec": 0, 00:18:12.968 "rw_mbytes_per_sec": 0, 00:18:12.968 "r_mbytes_per_sec": 0, 00:18:12.968 "w_mbytes_per_sec": 0 00:18:12.968 }, 00:18:12.968 "claimed": true, 00:18:12.968 "claim_type": "exclusive_write", 00:18:12.968 "zoned": false, 00:18:12.968 "supported_io_types": { 00:18:12.968 "read": true, 00:18:12.968 "write": true, 00:18:12.968 "unmap": true, 00:18:12.968 "write_zeroes": true, 00:18:12.968 "flush": true, 00:18:12.968 "reset": true, 00:18:12.968 "compare": false, 00:18:12.968 "compare_and_write": false, 00:18:12.968 "abort": true, 00:18:12.968 "nvme_admin": false, 00:18:12.968 "nvme_io": false 00:18:12.968 }, 00:18:12.968 "memory_domains": [ 00:18:12.968 { 00:18:12.968 "dma_device_id": "system", 00:18:12.968 "dma_device_type": 1 00:18:12.968 }, 00:18:12.968 { 00:18:12.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.968 "dma_device_type": 2 00:18:12.968 } 00:18:12.968 ], 00:18:12.968 "driver_specific": { 00:18:12.968 "passthru": { 00:18:12.968 "name": "pt4", 00:18:12.968 "base_bdev_name": "malloc4" 00:18:12.968 } 00:18:12.968 } 00:18:12.968 }' 00:18:12.968 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:12.968 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:12.968 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:12.968 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:13.233 00:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:18:13.510 [2024-05-15 00:00:14.039723] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:13.510 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=586689d2-d9fd-434a-aaaa-572430550058 00:18:13.510 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 586689d2-d9fd-434a-aaaa-572430550058 ']' 00:18:13.510 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:13.776 [2024-05-15 00:00:14.284080] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:13.776 [2024-05-15 00:00:14.284105] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:13.776 [2024-05-15 00:00:14.284160] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:13.776 [2024-05-15 00:00:14.284221] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:13.776 [2024-05-15 00:00:14.284233] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c56480 name raid_bdev1, state offline 00:18:13.776 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.776 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:18:14.040 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:18:14.040 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:18:14.040 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:14.040 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:14.307 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:14.307 00:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:14.566 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:14.566 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:14.823 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:14.823 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:15.085 
00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:15.085 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:15.345 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:15.603 [2024-05-15 00:00:15.972479] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:15.603 [2024-05-15 00:00:15.973835] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:15.603 [2024-05-15 00:00:15.973878] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:15.603 [2024-05-15 00:00:15.973913] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:15.603 [2024-05-15 00:00:15.973960] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:15.603 [2024-05-15 00:00:15.973999] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:15.603 [2024-05-15 00:00:15.974022] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:15.603 [2024-05-15 00:00:15.974050] 
bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:15.603 [2024-05-15 00:00:15.974069] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:15.603 [2024-05-15 00:00:15.974079] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c57f60 name raid_bdev1, state configuring 00:18:15.603 request: 00:18:15.603 { 00:18:15.603 "name": "raid_bdev1", 00:18:15.603 "raid_level": "concat", 00:18:15.603 "base_bdevs": [ 00:18:15.603 "malloc1", 00:18:15.603 "malloc2", 00:18:15.603 "malloc3", 00:18:15.603 "malloc4" 00:18:15.603 ], 00:18:15.603 "superblock": false, 00:18:15.603 "strip_size_kb": 64, 00:18:15.603 "method": "bdev_raid_create", 00:18:15.603 "req_id": 1 00:18:15.603 } 00:18:15.603 Got JSON-RPC error response 00:18:15.603 response: 00:18:15.603 { 00:18:15.603 "code": -17, 00:18:15.603 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:15.603 } 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.603 00:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:18:15.861 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:18:15.861 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:18:15.861 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:16.118 [2024-05-15 00:00:16.461692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:16.118 [2024-05-15 00:00:16.461745] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:16.118 [2024-05-15 00:00:16.461766] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c4faa0 00:18:16.118 [2024-05-15 00:00:16.461779] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:16.118 [2024-05-15 00:00:16.463456] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:16.118 [2024-05-15 00:00:16.463486] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:16.118 [2024-05-15 00:00:16.463556] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:18:16.118 [2024-05-15 00:00:16.463584] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:16.118 pt1 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:16.118 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.119 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:16.377 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:16.377 "name": "raid_bdev1", 00:18:16.377 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:16.377 "strip_size_kb": 64, 00:18:16.377 "state": "configuring", 00:18:16.377 "raid_level": "concat", 00:18:16.378 "superblock": true, 00:18:16.378 "num_base_bdevs": 4, 00:18:16.378 "num_base_bdevs_discovered": 1, 00:18:16.378 "num_base_bdevs_operational": 4, 00:18:16.378 "base_bdevs_list": [ 00:18:16.378 { 00:18:16.378 "name": "pt1", 00:18:16.378 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:16.378 "is_configured": true, 00:18:16.378 "data_offset": 2048, 00:18:16.378 "data_size": 63488 00:18:16.378 }, 00:18:16.378 { 00:18:16.378 "name": null, 00:18:16.378 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:16.378 "is_configured": false, 00:18:16.378 "data_offset": 2048, 00:18:16.378 "data_size": 63488 00:18:16.378 }, 00:18:16.378 { 00:18:16.378 "name": null, 00:18:16.378 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:16.378 "is_configured": false, 00:18:16.378 "data_offset": 2048, 00:18:16.378 "data_size": 63488 00:18:16.378 }, 00:18:16.378 { 00:18:16.378 "name": null, 00:18:16.378 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:16.378 "is_configured": false, 00:18:16.378 "data_offset": 2048, 00:18:16.378 "data_size": 63488 00:18:16.378 } 00:18:16.378 ] 00:18:16.378 }' 00:18:16.378 00:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:16.378 00:00:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.944 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:18:16.944 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:16.944 [2024-05-15 00:00:17.528520] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:16.944 [2024-05-15 00:00:17.528571] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:16.944 [2024-05-15 00:00:17.528592] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c51320 00:18:16.944 [2024-05-15 00:00:17.528605] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:16.944 [2024-05-15 00:00:17.528942] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:18:16.944 [2024-05-15 00:00:17.528961] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:16.944 [2024-05-15 00:00:17.529028] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:18:16.944 [2024-05-15 00:00:17.529047] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:16.944 pt2 00:18:17.202 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:17.202 [2024-05-15 00:00:17.773177] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.462 00:00:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.462 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:17.462 "name": "raid_bdev1", 00:18:17.462 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:17.462 "strip_size_kb": 64, 00:18:17.462 "state": "configuring", 00:18:17.462 "raid_level": "concat", 00:18:17.462 "superblock": true, 00:18:17.462 "num_base_bdevs": 4, 00:18:17.462 "num_base_bdevs_discovered": 1, 00:18:17.462 "num_base_bdevs_operational": 4, 00:18:17.462 "base_bdevs_list": [ 00:18:17.462 { 00:18:17.462 "name": "pt1", 00:18:17.462 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:17.462 "is_configured": true, 00:18:17.462 "data_offset": 2048, 00:18:17.462 "data_size": 63488 00:18:17.462 }, 00:18:17.462 { 00:18:17.462 "name": null, 00:18:17.462 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:17.462 "is_configured": false, 00:18:17.462 "data_offset": 2048, 00:18:17.462 "data_size": 63488 00:18:17.462 }, 00:18:17.462 { 00:18:17.462 "name": null, 00:18:17.462 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:17.462 "is_configured": false, 00:18:17.462 "data_offset": 2048, 00:18:17.462 "data_size": 63488 00:18:17.462 }, 00:18:17.462 { 00:18:17.462 "name": null, 00:18:17.462 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:17.462 "is_configured": false, 00:18:17.462 "data_offset": 2048, 00:18:17.462 "data_size": 63488 00:18:17.462 } 00:18:17.462 ] 
00:18:17.462 }' 00:18:17.462 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:17.462 00:00:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:18.395 [2024-05-15 00:00:18.860055] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:18.395 [2024-05-15 00:00:18.860109] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.395 [2024-05-15 00:00:18.860131] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c54840 00:18:18.395 [2024-05-15 00:00:18.860143] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.395 [2024-05-15 00:00:18.860491] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.395 [2024-05-15 00:00:18.860512] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:18.395 [2024-05-15 00:00:18.860578] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:18:18.395 [2024-05-15 00:00:18.860597] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:18.395 pt2 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:18:18.395 00:00:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:18.654 [2024-05-15 00:00:19.108713] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:18.654 [2024-05-15 00:00:19.108753] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.654 [2024-05-15 00:00:19.108772] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aa50e0 00:18:18.654 [2024-05-15 00:00:19.108784] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.654 [2024-05-15 00:00:19.109071] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.654 [2024-05-15 00:00:19.109088] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:18.654 [2024-05-15 00:00:19.109145] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:18:18.654 [2024-05-15 00:00:19.109161] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:18.654 pt3 00:18:18.654 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:18:18.654 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:18:18.654 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 
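[editor's note] For readers following the trace, the block below is a minimal sketch of the RPC sequence this superblock test keeps exercising, reconstructed only from the rpc.py calls visible in the log above (bdev_malloc_create, bdev_passthru_create, bdev_raid_create, bdev_raid_get_bdevs). The $RPC variable and the for-loop are illustrative shorthand, not the test script's own structure; the commands, flags, socket path, and UUIDs are taken verbatim from the trace, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock.

# Hedged sketch, assuming a running SPDK app on /var/tmp/spdk-raid.sock
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

for i in 1 2 3 4; do
  # 32 MiB malloc bdev with 512-byte blocks, wrapped in a passthru bdev ptN
  $RPC bdev_malloc_create 32 512 -b malloc$i
  $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
done

# concat RAID over the four passthru bdevs, 64 KiB strip size, with on-disk superblock (-s)
$RPC bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

# verify: state should be "online" with 4 of 4 base bdevs discovered
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The later phases of the log exercise the superblock path itself: bdev_raid_create over the raw malloc bdevs is rejected with -17 "File exists" because each already carries a superblock for raid_bdev1, and deleting then re-creating the passthru bdevs lets the examine callback ("raid superblock found on bdev ptN") re-assemble raid_bdev1 without an explicit create call.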
00:18:18.912 [2024-05-15 00:00:19.353357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:18.912 [2024-05-15 00:00:19.353395] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.912 [2024-05-15 00:00:19.353430] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c538d0 00:18:18.912 [2024-05-15 00:00:19.353443] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.912 [2024-05-15 00:00:19.353726] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.912 [2024-05-15 00:00:19.353744] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:18.912 [2024-05-15 00:00:19.353795] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:18:18.912 [2024-05-15 00:00:19.353812] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:18.912 [2024-05-15 00:00:19.353925] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c55cb0 00:18:18.912 [2024-05-15 00:00:19.353936] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:18.912 [2024-05-15 00:00:19.354099] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c48d00 00:18:18.912 [2024-05-15 00:00:19.354227] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c55cb0 00:18:18.912 [2024-05-15 00:00:19.354236] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c55cb0 00:18:18.912 [2024-05-15 00:00:19.354328] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:18.912 pt4 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.912 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.170 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:19.170 "name": "raid_bdev1", 00:18:19.170 "uuid": 
"586689d2-d9fd-434a-aaaa-572430550058", 00:18:19.170 "strip_size_kb": 64, 00:18:19.170 "state": "online", 00:18:19.170 "raid_level": "concat", 00:18:19.170 "superblock": true, 00:18:19.170 "num_base_bdevs": 4, 00:18:19.170 "num_base_bdevs_discovered": 4, 00:18:19.170 "num_base_bdevs_operational": 4, 00:18:19.170 "base_bdevs_list": [ 00:18:19.170 { 00:18:19.170 "name": "pt1", 00:18:19.170 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:19.170 "is_configured": true, 00:18:19.170 "data_offset": 2048, 00:18:19.170 "data_size": 63488 00:18:19.170 }, 00:18:19.170 { 00:18:19.170 "name": "pt2", 00:18:19.170 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:19.170 "is_configured": true, 00:18:19.170 "data_offset": 2048, 00:18:19.170 "data_size": 63488 00:18:19.170 }, 00:18:19.170 { 00:18:19.170 "name": "pt3", 00:18:19.170 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:19.170 "is_configured": true, 00:18:19.170 "data_offset": 2048, 00:18:19.170 "data_size": 63488 00:18:19.170 }, 00:18:19.170 { 00:18:19.170 "name": "pt4", 00:18:19.170 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:19.170 "is_configured": true, 00:18:19.170 "data_offset": 2048, 00:18:19.170 "data_size": 63488 00:18:19.170 } 00:18:19.170 ] 00:18:19.170 }' 00:18:19.170 00:00:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:19.170 00:00:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:19.738 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:19.996 [2024-05-15 00:00:20.428500] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:19.996 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:19.996 "name": "raid_bdev1", 00:18:19.996 "aliases": [ 00:18:19.996 "586689d2-d9fd-434a-aaaa-572430550058" 00:18:19.996 ], 00:18:19.996 "product_name": "Raid Volume", 00:18:19.996 "block_size": 512, 00:18:19.996 "num_blocks": 253952, 00:18:19.996 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:19.996 "assigned_rate_limits": { 00:18:19.996 "rw_ios_per_sec": 0, 00:18:19.996 "rw_mbytes_per_sec": 0, 00:18:19.996 "r_mbytes_per_sec": 0, 00:18:19.996 "w_mbytes_per_sec": 0 00:18:19.996 }, 00:18:19.996 "claimed": false, 00:18:19.996 "zoned": false, 00:18:19.996 "supported_io_types": { 00:18:19.996 "read": true, 00:18:19.996 "write": true, 00:18:19.996 "unmap": true, 00:18:19.996 "write_zeroes": true, 00:18:19.996 "flush": true, 00:18:19.996 "reset": true, 00:18:19.996 "compare": false, 00:18:19.996 "compare_and_write": false, 00:18:19.996 "abort": false, 00:18:19.996 "nvme_admin": false, 00:18:19.996 "nvme_io": false 00:18:19.996 
}, 00:18:19.996 "memory_domains": [ 00:18:19.996 { 00:18:19.996 "dma_device_id": "system", 00:18:19.996 "dma_device_type": 1 00:18:19.996 }, 00:18:19.996 { 00:18:19.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.996 "dma_device_type": 2 00:18:19.996 }, 00:18:19.996 { 00:18:19.996 "dma_device_id": "system", 00:18:19.996 "dma_device_type": 1 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.997 "dma_device_type": 2 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "dma_device_id": "system", 00:18:19.997 "dma_device_type": 1 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.997 "dma_device_type": 2 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "dma_device_id": "system", 00:18:19.997 "dma_device_type": 1 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.997 "dma_device_type": 2 00:18:19.997 } 00:18:19.997 ], 00:18:19.997 "driver_specific": { 00:18:19.997 "raid": { 00:18:19.997 "uuid": "586689d2-d9fd-434a-aaaa-572430550058", 00:18:19.997 "strip_size_kb": 64, 00:18:19.997 "state": "online", 00:18:19.997 "raid_level": "concat", 00:18:19.997 "superblock": true, 00:18:19.997 "num_base_bdevs": 4, 00:18:19.997 "num_base_bdevs_discovered": 4, 00:18:19.997 "num_base_bdevs_operational": 4, 00:18:19.997 "base_bdevs_list": [ 00:18:19.997 { 00:18:19.997 "name": "pt1", 00:18:19.997 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:19.997 "is_configured": true, 00:18:19.997 "data_offset": 2048, 00:18:19.997 "data_size": 63488 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "name": "pt2", 00:18:19.997 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:19.997 "is_configured": true, 00:18:19.997 "data_offset": 2048, 00:18:19.997 "data_size": 63488 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "name": "pt3", 00:18:19.997 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:19.997 "is_configured": true, 00:18:19.997 "data_offset": 2048, 00:18:19.997 "data_size": 63488 00:18:19.997 }, 00:18:19.997 { 00:18:19.997 "name": "pt4", 00:18:19.997 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:19.997 "is_configured": true, 00:18:19.997 "data_offset": 2048, 00:18:19.997 "data_size": 63488 00:18:19.997 } 00:18:19.997 ] 00:18:19.997 } 00:18:19.997 } 00:18:19.997 }' 00:18:19.997 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:19.997 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:18:19.997 pt2 00:18:19.997 pt3 00:18:19.997 pt4' 00:18:19.997 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:19.997 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:19.997 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:20.255 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:20.255 "name": "pt1", 00:18:20.255 "aliases": [ 00:18:20.255 "b84e654d-bf12-598d-a010-c33a398514e6" 00:18:20.255 ], 00:18:20.255 "product_name": "passthru", 00:18:20.255 "block_size": 512, 00:18:20.255 "num_blocks": 65536, 00:18:20.255 "uuid": "b84e654d-bf12-598d-a010-c33a398514e6", 00:18:20.255 "assigned_rate_limits": { 00:18:20.255 "rw_ios_per_sec": 0, 00:18:20.255 "rw_mbytes_per_sec": 0, 00:18:20.255 
"r_mbytes_per_sec": 0, 00:18:20.255 "w_mbytes_per_sec": 0 00:18:20.255 }, 00:18:20.255 "claimed": true, 00:18:20.255 "claim_type": "exclusive_write", 00:18:20.255 "zoned": false, 00:18:20.255 "supported_io_types": { 00:18:20.255 "read": true, 00:18:20.255 "write": true, 00:18:20.255 "unmap": true, 00:18:20.255 "write_zeroes": true, 00:18:20.255 "flush": true, 00:18:20.255 "reset": true, 00:18:20.255 "compare": false, 00:18:20.255 "compare_and_write": false, 00:18:20.255 "abort": true, 00:18:20.255 "nvme_admin": false, 00:18:20.255 "nvme_io": false 00:18:20.255 }, 00:18:20.255 "memory_domains": [ 00:18:20.255 { 00:18:20.255 "dma_device_id": "system", 00:18:20.255 "dma_device_type": 1 00:18:20.255 }, 00:18:20.255 { 00:18:20.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.255 "dma_device_type": 2 00:18:20.255 } 00:18:20.255 ], 00:18:20.255 "driver_specific": { 00:18:20.255 "passthru": { 00:18:20.255 "name": "pt1", 00:18:20.255 "base_bdev_name": "malloc1" 00:18:20.255 } 00:18:20.255 } 00:18:20.255 }' 00:18:20.255 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:20.255 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:20.255 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:20.255 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.523 00:00:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:20.523 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:20.523 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:20.523 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:20.523 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:20.523 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:20.780 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:20.780 "name": "pt2", 00:18:20.780 "aliases": [ 00:18:20.780 "4c3f38d2-c13c-5452-83a3-4db073e419f5" 00:18:20.780 ], 00:18:20.780 "product_name": "passthru", 00:18:20.780 "block_size": 512, 00:18:20.780 "num_blocks": 65536, 00:18:20.780 "uuid": "4c3f38d2-c13c-5452-83a3-4db073e419f5", 00:18:20.780 "assigned_rate_limits": { 00:18:20.780 "rw_ios_per_sec": 0, 00:18:20.780 "rw_mbytes_per_sec": 0, 00:18:20.780 "r_mbytes_per_sec": 0, 00:18:20.780 "w_mbytes_per_sec": 0 00:18:20.780 }, 00:18:20.780 "claimed": true, 00:18:20.780 "claim_type": "exclusive_write", 00:18:20.780 "zoned": false, 00:18:20.780 "supported_io_types": { 00:18:20.780 "read": true, 00:18:20.780 "write": true, 00:18:20.780 "unmap": true, 00:18:20.780 "write_zeroes": true, 00:18:20.780 "flush": true, 00:18:20.780 "reset": true, 00:18:20.780 "compare": false, 00:18:20.780 
"compare_and_write": false, 00:18:20.780 "abort": true, 00:18:20.781 "nvme_admin": false, 00:18:20.781 "nvme_io": false 00:18:20.781 }, 00:18:20.781 "memory_domains": [ 00:18:20.781 { 00:18:20.781 "dma_device_id": "system", 00:18:20.781 "dma_device_type": 1 00:18:20.781 }, 00:18:20.781 { 00:18:20.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.781 "dma_device_type": 2 00:18:20.781 } 00:18:20.781 ], 00:18:20.781 "driver_specific": { 00:18:20.781 "passthru": { 00:18:20.781 "name": "pt2", 00:18:20.781 "base_bdev_name": "malloc2" 00:18:20.781 } 00:18:20.781 } 00:18:20.781 }' 00:18:20.781 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:20.781 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:20.781 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:20.781 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:20.781 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:21.039 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:21.297 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:21.297 "name": "pt3", 00:18:21.297 "aliases": [ 00:18:21.297 "efcc5234-19a6-5e12-a19e-6a27229f7d84" 00:18:21.297 ], 00:18:21.297 "product_name": "passthru", 00:18:21.297 "block_size": 512, 00:18:21.297 "num_blocks": 65536, 00:18:21.297 "uuid": "efcc5234-19a6-5e12-a19e-6a27229f7d84", 00:18:21.297 "assigned_rate_limits": { 00:18:21.297 "rw_ios_per_sec": 0, 00:18:21.297 "rw_mbytes_per_sec": 0, 00:18:21.297 "r_mbytes_per_sec": 0, 00:18:21.297 "w_mbytes_per_sec": 0 00:18:21.297 }, 00:18:21.297 "claimed": true, 00:18:21.297 "claim_type": "exclusive_write", 00:18:21.297 "zoned": false, 00:18:21.297 "supported_io_types": { 00:18:21.297 "read": true, 00:18:21.297 "write": true, 00:18:21.297 "unmap": true, 00:18:21.297 "write_zeroes": true, 00:18:21.297 "flush": true, 00:18:21.297 "reset": true, 00:18:21.297 "compare": false, 00:18:21.297 "compare_and_write": false, 00:18:21.297 "abort": true, 00:18:21.297 "nvme_admin": false, 00:18:21.297 "nvme_io": false 00:18:21.297 }, 00:18:21.297 "memory_domains": [ 00:18:21.297 { 00:18:21.297 "dma_device_id": "system", 00:18:21.297 "dma_device_type": 1 00:18:21.297 }, 00:18:21.297 { 00:18:21.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.297 "dma_device_type": 2 00:18:21.297 } 00:18:21.297 ], 00:18:21.297 
"driver_specific": { 00:18:21.297 "passthru": { 00:18:21.297 "name": "pt3", 00:18:21.297 "base_bdev_name": "malloc3" 00:18:21.297 } 00:18:21.297 } 00:18:21.297 }' 00:18:21.297 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:21.297 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:21.555 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:21.555 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:21.555 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:21.555 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.555 00:00:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:21.555 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:21.812 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:21.812 "name": "pt4", 00:18:21.812 "aliases": [ 00:18:21.812 "898bdea4-90fe-58ad-a719-6bda49ac0b69" 00:18:21.812 ], 00:18:21.812 "product_name": "passthru", 00:18:21.812 "block_size": 512, 00:18:21.812 "num_blocks": 65536, 00:18:21.812 "uuid": "898bdea4-90fe-58ad-a719-6bda49ac0b69", 00:18:21.812 "assigned_rate_limits": { 00:18:21.812 "rw_ios_per_sec": 0, 00:18:21.812 "rw_mbytes_per_sec": 0, 00:18:21.812 "r_mbytes_per_sec": 0, 00:18:21.812 "w_mbytes_per_sec": 0 00:18:21.812 }, 00:18:21.812 "claimed": true, 00:18:21.812 "claim_type": "exclusive_write", 00:18:21.812 "zoned": false, 00:18:21.812 "supported_io_types": { 00:18:21.812 "read": true, 00:18:21.812 "write": true, 00:18:21.812 "unmap": true, 00:18:21.812 "write_zeroes": true, 00:18:21.812 "flush": true, 00:18:21.812 "reset": true, 00:18:21.812 "compare": false, 00:18:21.812 "compare_and_write": false, 00:18:21.812 "abort": true, 00:18:21.812 "nvme_admin": false, 00:18:21.812 "nvme_io": false 00:18:21.812 }, 00:18:21.812 "memory_domains": [ 00:18:21.812 { 00:18:21.812 "dma_device_id": "system", 00:18:21.812 "dma_device_type": 1 00:18:21.812 }, 00:18:21.813 { 00:18:21.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.813 "dma_device_type": 2 00:18:21.813 } 00:18:21.813 ], 00:18:21.813 "driver_specific": { 00:18:21.813 "passthru": { 00:18:21.813 "name": "pt4", 00:18:21.813 "base_bdev_name": "malloc4" 00:18:21.813 } 00:18:21.813 } 00:18:21.813 }' 00:18:21.813 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 
-- # [[ 512 == 512 ]] 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:22.071 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:22.329 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:22.329 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:22.329 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:18:22.329 [2024-05-15 00:00:22.907071] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 586689d2-d9fd-434a-aaaa-572430550058 '!=' 586689d2-d9fd-434a-aaaa-572430550058 ']' 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 456468 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 456468 ']' 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 456468 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 456468 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 456468' 00:18:22.588 killing process with pid 456468 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 456468 00:18:22.588 [2024-05-15 00:00:22.973832] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:22.588 [2024-05-15 00:00:22.973905] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:22.588 [2024-05-15 00:00:22.973973] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:22.588 [2024-05-15 00:00:22.973985] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c55cb0 name raid_bdev1, state offline 00:18:22.588 00:00:22 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@970 -- # wait 456468 00:18:22.588 [2024-05-15 00:00:23.016164] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:22.847 00:00:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:18:22.847 00:18:22.847 real 0m16.123s 00:18:22.847 user 0m29.047s 00:18:22.847 sys 0m2.917s 00:18:22.847 00:00:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:22.847 00:00:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.847 ************************************ 00:18:22.847 END TEST raid_superblock_test 00:18:22.847 ************************************ 00:18:22.847 00:00:23 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:18:22.847 00:00:23 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:18:22.847 00:00:23 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:18:22.847 00:00:23 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:22.847 00:00:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:22.847 ************************************ 00:18:22.847 START TEST raid_state_function_test 00:18:22.847 ************************************ 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 false 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local base_bdevs 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=458906 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 458906' 00:18:22.847 Process raid pid: 458906 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 458906 /var/tmp/spdk-raid.sock 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 458906 ']' 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:22.847 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:22.847 00:00:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.847 [2024-05-15 00:00:23.417859] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:18:22.847 [2024-05-15 00:00:23.417923] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:23.105 [2024-05-15 00:00:23.542259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:23.105 [2024-05-15 00:00:23.648616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:23.363 [2024-05-15 00:00:23.715728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:23.363 [2024-05-15 00:00:23.715764] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:23.930 00:00:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:23.930 00:00:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:18:23.930 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:24.189 [2024-05-15 00:00:24.553069] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:24.189 [2024-05-15 00:00:24.553113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:24.189 [2024-05-15 00:00:24.553125] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:24.189 [2024-05-15 00:00:24.553137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:24.189 [2024-05-15 00:00:24.553147] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:24.189 [2024-05-15 00:00:24.553158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:24.189 [2024-05-15 00:00:24.553168] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:24.189 [2024-05-15 00:00:24.553180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:24.189 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.447 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:24.447 "name": "Existed_Raid", 00:18:24.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.447 "strip_size_kb": 0, 00:18:24.447 "state": "configuring", 00:18:24.447 "raid_level": "raid1", 00:18:24.447 "superblock": false, 00:18:24.447 "num_base_bdevs": 4, 00:18:24.447 "num_base_bdevs_discovered": 0, 00:18:24.447 "num_base_bdevs_operational": 4, 00:18:24.447 "base_bdevs_list": [ 00:18:24.447 { 00:18:24.447 "name": "BaseBdev1", 00:18:24.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.447 "is_configured": false, 00:18:24.447 "data_offset": 0, 00:18:24.447 "data_size": 0 00:18:24.447 }, 00:18:24.447 { 00:18:24.447 "name": "BaseBdev2", 00:18:24.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.447 "is_configured": false, 00:18:24.447 "data_offset": 0, 00:18:24.447 "data_size": 0 00:18:24.447 }, 00:18:24.447 { 00:18:24.447 "name": "BaseBdev3", 00:18:24.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.447 "is_configured": false, 00:18:24.447 "data_offset": 0, 00:18:24.447 "data_size": 0 00:18:24.447 }, 00:18:24.447 { 00:18:24.447 "name": "BaseBdev4", 00:18:24.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.447 "is_configured": false, 00:18:24.447 "data_offset": 0, 00:18:24.447 "data_size": 0 00:18:24.447 } 00:18:24.447 ] 00:18:24.447 }' 00:18:24.447 00:00:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:24.448 00:00:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.014 00:00:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:25.014 [2024-05-15 00:00:25.551570] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:25.014 [2024-05-15 00:00:25.551603] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00c00 name Existed_Raid, state configuring 00:18:25.014 00:00:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:25.273 [2024-05-15 00:00:25.792229] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:25.273 [2024-05-15 00:00:25.792258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:25.273 [2024-05-15 00:00:25.792269] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:25.273 [2024-05-15 00:00:25.792281] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:25.273 [2024-05-15 00:00:25.792289] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:25.273 [2024-05-15 00:00:25.792301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:25.273 [2024-05-15 00:00:25.792310] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:25.273 [2024-05-15 00:00:25.792321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 
doesn't exist now 00:18:25.273 00:00:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:25.531 [2024-05-15 00:00:25.978580] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:25.531 BaseBdev1 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:25.531 00:00:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.796 00:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:25.796 [ 00:18:25.796 { 00:18:25.796 "name": "BaseBdev1", 00:18:25.796 "aliases": [ 00:18:25.796 "a20060fc-fabb-4478-a71e-64f26f7cf071" 00:18:25.796 ], 00:18:25.796 "product_name": "Malloc disk", 00:18:25.796 "block_size": 512, 00:18:25.796 "num_blocks": 65536, 00:18:25.797 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:25.797 "assigned_rate_limits": { 00:18:25.797 "rw_ios_per_sec": 0, 00:18:25.797 "rw_mbytes_per_sec": 0, 00:18:25.797 "r_mbytes_per_sec": 0, 00:18:25.797 "w_mbytes_per_sec": 0 00:18:25.797 }, 00:18:25.797 "claimed": true, 00:18:25.797 "claim_type": "exclusive_write", 00:18:25.797 "zoned": false, 00:18:25.797 "supported_io_types": { 00:18:25.797 "read": true, 00:18:25.797 "write": true, 00:18:25.797 "unmap": true, 00:18:25.797 "write_zeroes": true, 00:18:25.797 "flush": true, 00:18:25.797 "reset": true, 00:18:25.797 "compare": false, 00:18:25.797 "compare_and_write": false, 00:18:25.797 "abort": true, 00:18:25.797 "nvme_admin": false, 00:18:25.797 "nvme_io": false 00:18:25.797 }, 00:18:25.797 "memory_domains": [ 00:18:25.797 { 00:18:25.797 "dma_device_id": "system", 00:18:25.797 "dma_device_type": 1 00:18:25.797 }, 00:18:25.797 { 00:18:25.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.797 "dma_device_type": 2 00:18:25.797 } 00:18:25.797 ], 00:18:25.797 "driver_specific": {} 00:18:25.797 } 00:18:25.797 ] 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 
00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.797 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.105 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:26.105 "name": "Existed_Raid", 00:18:26.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.105 "strip_size_kb": 0, 00:18:26.105 "state": "configuring", 00:18:26.105 "raid_level": "raid1", 00:18:26.105 "superblock": false, 00:18:26.105 "num_base_bdevs": 4, 00:18:26.105 "num_base_bdevs_discovered": 1, 00:18:26.105 "num_base_bdevs_operational": 4, 00:18:26.105 "base_bdevs_list": [ 00:18:26.105 { 00:18:26.105 "name": "BaseBdev1", 00:18:26.105 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:26.105 "is_configured": true, 00:18:26.105 "data_offset": 0, 00:18:26.105 "data_size": 65536 00:18:26.105 }, 00:18:26.105 { 00:18:26.105 "name": "BaseBdev2", 00:18:26.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.105 "is_configured": false, 00:18:26.105 "data_offset": 0, 00:18:26.105 "data_size": 0 00:18:26.105 }, 00:18:26.105 { 00:18:26.105 "name": "BaseBdev3", 00:18:26.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.105 "is_configured": false, 00:18:26.105 "data_offset": 0, 00:18:26.105 "data_size": 0 00:18:26.105 }, 00:18:26.105 { 00:18:26.105 "name": "BaseBdev4", 00:18:26.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.105 "is_configured": false, 00:18:26.105 "data_offset": 0, 00:18:26.105 "data_size": 0 00:18:26.105 } 00:18:26.105 ] 00:18:26.106 }' 00:18:26.106 00:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:26.106 00:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.672 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:26.932 [2024-05-15 00:00:27.354205] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:26.932 [2024-05-15 00:00:27.354243] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00ea0 name Existed_Raid, state configuring 00:18:26.932 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:27.190 [2024-05-15 00:00:27.598884] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:27.190 [2024-05-15 00:00:27.600351] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:27.190 [2024-05-15 00:00:27.600382] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:27.191 [2024-05-15 00:00:27.600393] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:27.191 [2024-05-15 00:00:27.600412] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:27.191 [2024-05-15 00:00:27.600421] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:27.191 [2024-05-15 00:00:27.600433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.191 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.449 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:27.449 "name": "Existed_Raid", 00:18:27.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.449 "strip_size_kb": 0, 00:18:27.449 "state": "configuring", 00:18:27.449 "raid_level": "raid1", 00:18:27.449 "superblock": false, 00:18:27.449 "num_base_bdevs": 4, 00:18:27.449 "num_base_bdevs_discovered": 1, 00:18:27.449 "num_base_bdevs_operational": 4, 00:18:27.449 "base_bdevs_list": [ 00:18:27.449 { 00:18:27.449 "name": "BaseBdev1", 00:18:27.449 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:27.449 "is_configured": true, 00:18:27.449 "data_offset": 0, 00:18:27.449 "data_size": 65536 00:18:27.449 }, 00:18:27.449 { 00:18:27.449 "name": "BaseBdev2", 00:18:27.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.449 "is_configured": false, 00:18:27.449 "data_offset": 0, 00:18:27.449 "data_size": 0 00:18:27.449 }, 00:18:27.449 { 00:18:27.449 "name": "BaseBdev3", 00:18:27.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.449 "is_configured": false, 00:18:27.449 "data_offset": 0, 00:18:27.449 "data_size": 0 00:18:27.449 }, 00:18:27.449 { 00:18:27.449 "name": "BaseBdev4", 00:18:27.449 
"uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.449 "is_configured": false, 00:18:27.449 "data_offset": 0, 00:18:27.449 "data_size": 0 00:18:27.449 } 00:18:27.449 ] 00:18:27.449 }' 00:18:27.449 00:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:27.449 00:00:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.016 00:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:28.275 [2024-05-15 00:00:28.697117] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:28.275 BaseBdev2 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:28.275 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.535 00:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:28.793 [ 00:18:28.793 { 00:18:28.793 "name": "BaseBdev2", 00:18:28.793 "aliases": [ 00:18:28.793 "911e49c9-62f0-4a3b-844f-0c97881d98df" 00:18:28.793 ], 00:18:28.793 "product_name": "Malloc disk", 00:18:28.793 "block_size": 512, 00:18:28.793 "num_blocks": 65536, 00:18:28.793 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:28.793 "assigned_rate_limits": { 00:18:28.793 "rw_ios_per_sec": 0, 00:18:28.793 "rw_mbytes_per_sec": 0, 00:18:28.793 "r_mbytes_per_sec": 0, 00:18:28.793 "w_mbytes_per_sec": 0 00:18:28.793 }, 00:18:28.793 "claimed": true, 00:18:28.793 "claim_type": "exclusive_write", 00:18:28.793 "zoned": false, 00:18:28.793 "supported_io_types": { 00:18:28.793 "read": true, 00:18:28.793 "write": true, 00:18:28.793 "unmap": true, 00:18:28.793 "write_zeroes": true, 00:18:28.793 "flush": true, 00:18:28.793 "reset": true, 00:18:28.793 "compare": false, 00:18:28.793 "compare_and_write": false, 00:18:28.793 "abort": true, 00:18:28.793 "nvme_admin": false, 00:18:28.793 "nvme_io": false 00:18:28.793 }, 00:18:28.793 "memory_domains": [ 00:18:28.793 { 00:18:28.793 "dma_device_id": "system", 00:18:28.793 "dma_device_type": 1 00:18:28.793 }, 00:18:28.793 { 00:18:28.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.793 "dma_device_type": 2 00:18:28.793 } 00:18:28.793 ], 00:18:28.793 "driver_specific": {} 00:18:28.793 } 00:18:28.793 ] 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:28.793 00:00:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.793 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.050 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:29.050 "name": "Existed_Raid", 00:18:29.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.050 "strip_size_kb": 0, 00:18:29.050 "state": "configuring", 00:18:29.050 "raid_level": "raid1", 00:18:29.050 "superblock": false, 00:18:29.050 "num_base_bdevs": 4, 00:18:29.050 "num_base_bdevs_discovered": 2, 00:18:29.050 "num_base_bdevs_operational": 4, 00:18:29.050 "base_bdevs_list": [ 00:18:29.050 { 00:18:29.050 "name": "BaseBdev1", 00:18:29.050 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:29.050 "is_configured": true, 00:18:29.050 "data_offset": 0, 00:18:29.050 "data_size": 65536 00:18:29.050 }, 00:18:29.050 { 00:18:29.050 "name": "BaseBdev2", 00:18:29.051 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:29.051 "is_configured": true, 00:18:29.051 "data_offset": 0, 00:18:29.051 "data_size": 65536 00:18:29.051 }, 00:18:29.051 { 00:18:29.051 "name": "BaseBdev3", 00:18:29.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.051 "is_configured": false, 00:18:29.051 "data_offset": 0, 00:18:29.051 "data_size": 0 00:18:29.051 }, 00:18:29.051 { 00:18:29.051 "name": "BaseBdev4", 00:18:29.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.051 "is_configured": false, 00:18:29.051 "data_offset": 0, 00:18:29.051 "data_size": 0 00:18:29.051 } 00:18:29.051 ] 00:18:29.051 }' 00:18:29.051 00:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:29.051 00:00:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.619 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:29.878 [2024-05-15 00:00:30.264746] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:29.878 BaseBdev3 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev3 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:29.878 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.136 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:30.395 [ 00:18:30.395 { 00:18:30.395 "name": "BaseBdev3", 00:18:30.395 "aliases": [ 00:18:30.395 "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f" 00:18:30.395 ], 00:18:30.395 "product_name": "Malloc disk", 00:18:30.395 "block_size": 512, 00:18:30.395 "num_blocks": 65536, 00:18:30.395 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:30.395 "assigned_rate_limits": { 00:18:30.395 "rw_ios_per_sec": 0, 00:18:30.395 "rw_mbytes_per_sec": 0, 00:18:30.395 "r_mbytes_per_sec": 0, 00:18:30.395 "w_mbytes_per_sec": 0 00:18:30.395 }, 00:18:30.395 "claimed": true, 00:18:30.395 "claim_type": "exclusive_write", 00:18:30.395 "zoned": false, 00:18:30.395 "supported_io_types": { 00:18:30.395 "read": true, 00:18:30.395 "write": true, 00:18:30.395 "unmap": true, 00:18:30.395 "write_zeroes": true, 00:18:30.395 "flush": true, 00:18:30.395 "reset": true, 00:18:30.395 "compare": false, 00:18:30.395 "compare_and_write": false, 00:18:30.395 "abort": true, 00:18:30.395 "nvme_admin": false, 00:18:30.395 "nvme_io": false 00:18:30.395 }, 00:18:30.395 "memory_domains": [ 00:18:30.395 { 00:18:30.395 "dma_device_id": "system", 00:18:30.395 "dma_device_type": 1 00:18:30.395 }, 00:18:30.395 { 00:18:30.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.395 "dma_device_type": 2 00:18:30.395 } 00:18:30.395 ], 00:18:30.395 "driver_specific": {} 00:18:30.395 } 00:18:30.395 ] 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:30.395 00:00:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.395 00:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.653 00:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:30.653 "name": "Existed_Raid", 00:18:30.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.653 "strip_size_kb": 0, 00:18:30.653 "state": "configuring", 00:18:30.654 "raid_level": "raid1", 00:18:30.654 "superblock": false, 00:18:30.654 "num_base_bdevs": 4, 00:18:30.654 "num_base_bdevs_discovered": 3, 00:18:30.654 "num_base_bdevs_operational": 4, 00:18:30.654 "base_bdevs_list": [ 00:18:30.654 { 00:18:30.654 "name": "BaseBdev1", 00:18:30.654 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:30.654 "is_configured": true, 00:18:30.654 "data_offset": 0, 00:18:30.654 "data_size": 65536 00:18:30.654 }, 00:18:30.654 { 00:18:30.654 "name": "BaseBdev2", 00:18:30.654 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:30.654 "is_configured": true, 00:18:30.654 "data_offset": 0, 00:18:30.654 "data_size": 65536 00:18:30.654 }, 00:18:30.654 { 00:18:30.654 "name": "BaseBdev3", 00:18:30.654 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:30.654 "is_configured": true, 00:18:30.654 "data_offset": 0, 00:18:30.654 "data_size": 65536 00:18:30.654 }, 00:18:30.654 { 00:18:30.654 "name": "BaseBdev4", 00:18:30.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.654 "is_configured": false, 00:18:30.654 "data_offset": 0, 00:18:30.654 "data_size": 0 00:18:30.654 } 00:18:30.654 ] 00:18:30.654 }' 00:18:30.654 00:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:30.654 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.221 00:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:31.480 [2024-05-15 00:00:31.812237] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:31.480 [2024-05-15 00:00:31.812276] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xf00470 00:18:31.480 [2024-05-15 00:00:31.812285] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:31.480 [2024-05-15 00:00:31.812495] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf00b40 00:18:31.480 [2024-05-15 00:00:31.812632] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf00470 00:18:31.480 [2024-05-15 00:00:31.812643] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf00470 00:18:31.480 [2024-05-15 00:00:31.812810] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:31.480 BaseBdev4 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:31.480 00:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:31.738 [ 00:18:31.738 { 00:18:31.738 "name": "BaseBdev4", 00:18:31.738 "aliases": [ 00:18:31.738 "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b" 00:18:31.738 ], 00:18:31.738 "product_name": "Malloc disk", 00:18:31.738 "block_size": 512, 00:18:31.738 "num_blocks": 65536, 00:18:31.738 "uuid": "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b", 00:18:31.738 "assigned_rate_limits": { 00:18:31.738 "rw_ios_per_sec": 0, 00:18:31.738 "rw_mbytes_per_sec": 0, 00:18:31.738 "r_mbytes_per_sec": 0, 00:18:31.738 "w_mbytes_per_sec": 0 00:18:31.738 }, 00:18:31.738 "claimed": true, 00:18:31.738 "claim_type": "exclusive_write", 00:18:31.738 "zoned": false, 00:18:31.738 "supported_io_types": { 00:18:31.738 "read": true, 00:18:31.738 "write": true, 00:18:31.738 "unmap": true, 00:18:31.738 "write_zeroes": true, 00:18:31.738 "flush": true, 00:18:31.738 "reset": true, 00:18:31.738 "compare": false, 00:18:31.738 "compare_and_write": false, 00:18:31.738 "abort": true, 00:18:31.738 "nvme_admin": false, 00:18:31.738 "nvme_io": false 00:18:31.738 }, 00:18:31.738 "memory_domains": [ 00:18:31.738 { 00:18:31.738 "dma_device_id": "system", 00:18:31.738 "dma_device_type": 1 00:18:31.738 }, 00:18:31.738 { 00:18:31.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.738 "dma_device_type": 2 00:18:31.738 } 00:18:31.738 ], 00:18:31.738 "driver_specific": {} 00:18:31.738 } 00:18:31.738 ] 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:31.738 00:00:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:31.738 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:31.997 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.997 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.997 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:31.997 "name": "Existed_Raid", 00:18:31.997 "uuid": "fcc26411-25e5-42ab-b09e-db6d051eed58", 00:18:31.997 "strip_size_kb": 0, 00:18:31.997 "state": "online", 00:18:31.997 "raid_level": "raid1", 00:18:31.997 "superblock": false, 00:18:31.997 "num_base_bdevs": 4, 00:18:31.997 "num_base_bdevs_discovered": 4, 00:18:31.997 "num_base_bdevs_operational": 4, 00:18:31.997 "base_bdevs_list": [ 00:18:31.997 { 00:18:31.997 "name": "BaseBdev1", 00:18:31.997 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:31.997 "is_configured": true, 00:18:31.997 "data_offset": 0, 00:18:31.997 "data_size": 65536 00:18:31.997 }, 00:18:31.997 { 00:18:31.997 "name": "BaseBdev2", 00:18:31.997 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:31.997 "is_configured": true, 00:18:31.997 "data_offset": 0, 00:18:31.997 "data_size": 65536 00:18:31.997 }, 00:18:31.997 { 00:18:31.997 "name": "BaseBdev3", 00:18:31.997 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:31.997 "is_configured": true, 00:18:31.997 "data_offset": 0, 00:18:31.997 "data_size": 65536 00:18:31.997 }, 00:18:31.997 { 00:18:31.997 "name": "BaseBdev4", 00:18:31.997 "uuid": "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b", 00:18:31.997 "is_configured": true, 00:18:31.997 "data_offset": 0, 00:18:31.997 "data_size": 65536 00:18:31.997 } 00:18:31.997 ] 00:18:31.997 }' 00:18:31.997 00:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:31.997 00:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.936 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:18:32.936 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:18:32.936 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:32.937 [2024-05-15 00:00:33.376695] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:32.937 "name": "Existed_Raid", 00:18:32.937 "aliases": [ 00:18:32.937 "fcc26411-25e5-42ab-b09e-db6d051eed58" 00:18:32.937 ], 00:18:32.937 "product_name": "Raid Volume", 00:18:32.937 
"block_size": 512, 00:18:32.937 "num_blocks": 65536, 00:18:32.937 "uuid": "fcc26411-25e5-42ab-b09e-db6d051eed58", 00:18:32.937 "assigned_rate_limits": { 00:18:32.937 "rw_ios_per_sec": 0, 00:18:32.937 "rw_mbytes_per_sec": 0, 00:18:32.937 "r_mbytes_per_sec": 0, 00:18:32.937 "w_mbytes_per_sec": 0 00:18:32.937 }, 00:18:32.937 "claimed": false, 00:18:32.937 "zoned": false, 00:18:32.937 "supported_io_types": { 00:18:32.937 "read": true, 00:18:32.937 "write": true, 00:18:32.937 "unmap": false, 00:18:32.937 "write_zeroes": true, 00:18:32.937 "flush": false, 00:18:32.937 "reset": true, 00:18:32.937 "compare": false, 00:18:32.937 "compare_and_write": false, 00:18:32.937 "abort": false, 00:18:32.937 "nvme_admin": false, 00:18:32.937 "nvme_io": false 00:18:32.937 }, 00:18:32.937 "memory_domains": [ 00:18:32.937 { 00:18:32.937 "dma_device_id": "system", 00:18:32.937 "dma_device_type": 1 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.937 "dma_device_type": 2 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "system", 00:18:32.937 "dma_device_type": 1 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.937 "dma_device_type": 2 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "system", 00:18:32.937 "dma_device_type": 1 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.937 "dma_device_type": 2 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "system", 00:18:32.937 "dma_device_type": 1 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.937 "dma_device_type": 2 00:18:32.937 } 00:18:32.937 ], 00:18:32.937 "driver_specific": { 00:18:32.937 "raid": { 00:18:32.937 "uuid": "fcc26411-25e5-42ab-b09e-db6d051eed58", 00:18:32.937 "strip_size_kb": 0, 00:18:32.937 "state": "online", 00:18:32.937 "raid_level": "raid1", 00:18:32.937 "superblock": false, 00:18:32.937 "num_base_bdevs": 4, 00:18:32.937 "num_base_bdevs_discovered": 4, 00:18:32.937 "num_base_bdevs_operational": 4, 00:18:32.937 "base_bdevs_list": [ 00:18:32.937 { 00:18:32.937 "name": "BaseBdev1", 00:18:32.937 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:32.937 "is_configured": true, 00:18:32.937 "data_offset": 0, 00:18:32.937 "data_size": 65536 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "name": "BaseBdev2", 00:18:32.937 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:32.937 "is_configured": true, 00:18:32.937 "data_offset": 0, 00:18:32.937 "data_size": 65536 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "name": "BaseBdev3", 00:18:32.937 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:32.937 "is_configured": true, 00:18:32.937 "data_offset": 0, 00:18:32.937 "data_size": 65536 00:18:32.937 }, 00:18:32.937 { 00:18:32.937 "name": "BaseBdev4", 00:18:32.937 "uuid": "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b", 00:18:32.937 "is_configured": true, 00:18:32.937 "data_offset": 0, 00:18:32.937 "data_size": 65536 00:18:32.937 } 00:18:32.937 ] 00:18:32.937 } 00:18:32.937 } 00:18:32.937 }' 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:18:32.937 BaseBdev2 00:18:32.937 BaseBdev3 00:18:32.937 BaseBdev4' 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in 
$base_bdev_names 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:32.937 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:33.195 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:33.195 "name": "BaseBdev1", 00:18:33.195 "aliases": [ 00:18:33.195 "a20060fc-fabb-4478-a71e-64f26f7cf071" 00:18:33.195 ], 00:18:33.195 "product_name": "Malloc disk", 00:18:33.195 "block_size": 512, 00:18:33.195 "num_blocks": 65536, 00:18:33.195 "uuid": "a20060fc-fabb-4478-a71e-64f26f7cf071", 00:18:33.195 "assigned_rate_limits": { 00:18:33.195 "rw_ios_per_sec": 0, 00:18:33.195 "rw_mbytes_per_sec": 0, 00:18:33.195 "r_mbytes_per_sec": 0, 00:18:33.195 "w_mbytes_per_sec": 0 00:18:33.195 }, 00:18:33.195 "claimed": true, 00:18:33.195 "claim_type": "exclusive_write", 00:18:33.195 "zoned": false, 00:18:33.195 "supported_io_types": { 00:18:33.195 "read": true, 00:18:33.195 "write": true, 00:18:33.195 "unmap": true, 00:18:33.195 "write_zeroes": true, 00:18:33.195 "flush": true, 00:18:33.195 "reset": true, 00:18:33.195 "compare": false, 00:18:33.195 "compare_and_write": false, 00:18:33.195 "abort": true, 00:18:33.195 "nvme_admin": false, 00:18:33.195 "nvme_io": false 00:18:33.195 }, 00:18:33.195 "memory_domains": [ 00:18:33.195 { 00:18:33.195 "dma_device_id": "system", 00:18:33.195 "dma_device_type": 1 00:18:33.195 }, 00:18:33.195 { 00:18:33.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.195 "dma_device_type": 2 00:18:33.195 } 00:18:33.195 ], 00:18:33.195 "driver_specific": {} 00:18:33.195 }' 00:18:33.195 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:33.195 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:33.195 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:33.195 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.454 00:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:33.454 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:33.712 "name": "BaseBdev2", 00:18:33.712 "aliases": [ 
00:18:33.712 "911e49c9-62f0-4a3b-844f-0c97881d98df" 00:18:33.712 ], 00:18:33.712 "product_name": "Malloc disk", 00:18:33.712 "block_size": 512, 00:18:33.712 "num_blocks": 65536, 00:18:33.712 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:33.712 "assigned_rate_limits": { 00:18:33.712 "rw_ios_per_sec": 0, 00:18:33.712 "rw_mbytes_per_sec": 0, 00:18:33.712 "r_mbytes_per_sec": 0, 00:18:33.712 "w_mbytes_per_sec": 0 00:18:33.712 }, 00:18:33.712 "claimed": true, 00:18:33.712 "claim_type": "exclusive_write", 00:18:33.712 "zoned": false, 00:18:33.712 "supported_io_types": { 00:18:33.712 "read": true, 00:18:33.712 "write": true, 00:18:33.712 "unmap": true, 00:18:33.712 "write_zeroes": true, 00:18:33.712 "flush": true, 00:18:33.712 "reset": true, 00:18:33.712 "compare": false, 00:18:33.712 "compare_and_write": false, 00:18:33.712 "abort": true, 00:18:33.712 "nvme_admin": false, 00:18:33.712 "nvme_io": false 00:18:33.712 }, 00:18:33.712 "memory_domains": [ 00:18:33.712 { 00:18:33.712 "dma_device_id": "system", 00:18:33.712 "dma_device_type": 1 00:18:33.712 }, 00:18:33.712 { 00:18:33.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.712 "dma_device_type": 2 00:18:33.712 } 00:18:33.712 ], 00:18:33.712 "driver_specific": {} 00:18:33.712 }' 00:18:33.712 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:33.970 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:33.970 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:33.971 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:33.971 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:33.971 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.971 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:33.971 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:34.238 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:34.505 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:34.505 "name": "BaseBdev3", 00:18:34.505 "aliases": [ 00:18:34.505 "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f" 00:18:34.505 ], 00:18:34.505 "product_name": "Malloc disk", 00:18:34.505 "block_size": 512, 00:18:34.505 "num_blocks": 65536, 00:18:34.505 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:34.505 "assigned_rate_limits": { 00:18:34.505 "rw_ios_per_sec": 0, 00:18:34.505 "rw_mbytes_per_sec": 0, 00:18:34.505 "r_mbytes_per_sec": 0, 00:18:34.505 "w_mbytes_per_sec": 0 00:18:34.505 }, 00:18:34.505 "claimed": true, 00:18:34.505 
"claim_type": "exclusive_write", 00:18:34.505 "zoned": false, 00:18:34.505 "supported_io_types": { 00:18:34.505 "read": true, 00:18:34.505 "write": true, 00:18:34.505 "unmap": true, 00:18:34.505 "write_zeroes": true, 00:18:34.505 "flush": true, 00:18:34.505 "reset": true, 00:18:34.505 "compare": false, 00:18:34.505 "compare_and_write": false, 00:18:34.505 "abort": true, 00:18:34.505 "nvme_admin": false, 00:18:34.505 "nvme_io": false 00:18:34.505 }, 00:18:34.505 "memory_domains": [ 00:18:34.505 { 00:18:34.505 "dma_device_id": "system", 00:18:34.505 "dma_device_type": 1 00:18:34.505 }, 00:18:34.505 { 00:18:34.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.505 "dma_device_type": 2 00:18:34.505 } 00:18:34.505 ], 00:18:34.505 "driver_specific": {} 00:18:34.505 }' 00:18:34.505 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:34.505 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:34.505 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:34.505 00:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:34.505 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:34.505 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.505 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:34.764 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:35.022 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:35.023 "name": "BaseBdev4", 00:18:35.023 "aliases": [ 00:18:35.023 "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b" 00:18:35.023 ], 00:18:35.023 "product_name": "Malloc disk", 00:18:35.023 "block_size": 512, 00:18:35.023 "num_blocks": 65536, 00:18:35.023 "uuid": "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b", 00:18:35.023 "assigned_rate_limits": { 00:18:35.023 "rw_ios_per_sec": 0, 00:18:35.023 "rw_mbytes_per_sec": 0, 00:18:35.023 "r_mbytes_per_sec": 0, 00:18:35.023 "w_mbytes_per_sec": 0 00:18:35.023 }, 00:18:35.023 "claimed": true, 00:18:35.023 "claim_type": "exclusive_write", 00:18:35.023 "zoned": false, 00:18:35.023 "supported_io_types": { 00:18:35.023 "read": true, 00:18:35.023 "write": true, 00:18:35.023 "unmap": true, 00:18:35.023 "write_zeroes": true, 00:18:35.023 "flush": true, 00:18:35.023 "reset": true, 00:18:35.023 "compare": false, 00:18:35.023 "compare_and_write": false, 00:18:35.023 "abort": true, 00:18:35.023 "nvme_admin": false, 00:18:35.023 "nvme_io": false 00:18:35.023 }, 00:18:35.023 
"memory_domains": [ 00:18:35.023 { 00:18:35.023 "dma_device_id": "system", 00:18:35.023 "dma_device_type": 1 00:18:35.023 }, 00:18:35.023 { 00:18:35.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.023 "dma_device_type": 2 00:18:35.023 } 00:18:35.023 ], 00:18:35.023 "driver_specific": {} 00:18:35.023 }' 00:18:35.023 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:35.023 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:35.023 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:35.023 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:35.023 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:35.281 00:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:35.540 [2024-05-15 00:00:36.031577] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:35.540 00:00:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.540 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:35.799 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:35.799 "name": "Existed_Raid", 00:18:35.799 "uuid": "fcc26411-25e5-42ab-b09e-db6d051eed58", 00:18:35.799 "strip_size_kb": 0, 00:18:35.799 "state": "online", 00:18:35.799 "raid_level": "raid1", 00:18:35.799 "superblock": false, 00:18:35.799 "num_base_bdevs": 4, 00:18:35.799 "num_base_bdevs_discovered": 3, 00:18:35.799 "num_base_bdevs_operational": 3, 00:18:35.799 "base_bdevs_list": [ 00:18:35.799 { 00:18:35.799 "name": null, 00:18:35.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.799 "is_configured": false, 00:18:35.799 "data_offset": 0, 00:18:35.799 "data_size": 65536 00:18:35.799 }, 00:18:35.799 { 00:18:35.799 "name": "BaseBdev2", 00:18:35.799 "uuid": "911e49c9-62f0-4a3b-844f-0c97881d98df", 00:18:35.799 "is_configured": true, 00:18:35.799 "data_offset": 0, 00:18:35.799 "data_size": 65536 00:18:35.799 }, 00:18:35.799 { 00:18:35.799 "name": "BaseBdev3", 00:18:35.799 "uuid": "c4bdbcf5-32e3-47d9-a74b-095cd9a6931f", 00:18:35.799 "is_configured": true, 00:18:35.799 "data_offset": 0, 00:18:35.799 "data_size": 65536 00:18:35.799 }, 00:18:35.799 { 00:18:35.799 "name": "BaseBdev4", 00:18:35.799 "uuid": "0f01d44c-54f9-4c6b-a6f4-fe715f911c6b", 00:18:35.799 "is_configured": true, 00:18:35.799 "data_offset": 0, 00:18:35.799 "data_size": 65536 00:18:35.799 } 00:18:35.799 ] 00:18:35.799 }' 00:18:35.799 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:35.799 00:00:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.365 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:18:36.365 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:36.365 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.365 00:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:36.623 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:36.623 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:36.623 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:36.882 [2024-05-15 00:00:37.356162] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:36.882 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:36.882 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:36.882 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.882 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:37.140 00:00:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:37.140 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:37.140 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:37.399 [2024-05-15 00:00:37.868229] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:37.399 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:37.399 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:37.399 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.399 00:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:37.657 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:37.657 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:37.657 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:37.915 [2024-05-15 00:00:38.353771] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:37.915 [2024-05-15 00:00:38.353836] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:37.915 [2024-05-15 00:00:38.366288] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:37.915 [2024-05-15 00:00:38.366350] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:37.915 [2024-05-15 00:00:38.366363] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00470 name Existed_Raid, state offline 00:18:37.915 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:37.915 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:37.915 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.915 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:38.173 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:38.431 BaseBdev2 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:18:38.431 
00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:38.431 00:00:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:38.689 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:38.946 [ 00:18:38.946 { 00:18:38.946 "name": "BaseBdev2", 00:18:38.946 "aliases": [ 00:18:38.946 "cc0f8a24-3a79-49d9-bb55-8d34c7885efa" 00:18:38.946 ], 00:18:38.946 "product_name": "Malloc disk", 00:18:38.946 "block_size": 512, 00:18:38.946 "num_blocks": 65536, 00:18:38.946 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:38.946 "assigned_rate_limits": { 00:18:38.946 "rw_ios_per_sec": 0, 00:18:38.946 "rw_mbytes_per_sec": 0, 00:18:38.946 "r_mbytes_per_sec": 0, 00:18:38.946 "w_mbytes_per_sec": 0 00:18:38.946 }, 00:18:38.946 "claimed": false, 00:18:38.946 "zoned": false, 00:18:38.946 "supported_io_types": { 00:18:38.946 "read": true, 00:18:38.946 "write": true, 00:18:38.946 "unmap": true, 00:18:38.946 "write_zeroes": true, 00:18:38.946 "flush": true, 00:18:38.946 "reset": true, 00:18:38.946 "compare": false, 00:18:38.946 "compare_and_write": false, 00:18:38.946 "abort": true, 00:18:38.946 "nvme_admin": false, 00:18:38.946 "nvme_io": false 00:18:38.946 }, 00:18:38.946 "memory_domains": [ 00:18:38.946 { 00:18:38.946 "dma_device_id": "system", 00:18:38.946 "dma_device_type": 1 00:18:38.946 }, 00:18:38.946 { 00:18:38.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.946 "dma_device_type": 2 00:18:38.946 } 00:18:38.946 ], 00:18:38.946 "driver_specific": {} 00:18:38.946 } 00:18:38.946 ] 00:18:38.946 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:38.946 00:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:38.946 00:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:38.946 00:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:39.204 BaseBdev3 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:39.204 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:39.204 00:00:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.467 00:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:39.467 [ 00:18:39.467 { 00:18:39.467 "name": "BaseBdev3", 00:18:39.467 "aliases": [ 00:18:39.467 "163ba573-8540-430b-b816-59505d0a6cbe" 00:18:39.467 ], 00:18:39.467 "product_name": "Malloc disk", 00:18:39.467 "block_size": 512, 00:18:39.467 "num_blocks": 65536, 00:18:39.467 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:39.467 "assigned_rate_limits": { 00:18:39.467 "rw_ios_per_sec": 0, 00:18:39.467 "rw_mbytes_per_sec": 0, 00:18:39.467 "r_mbytes_per_sec": 0, 00:18:39.467 "w_mbytes_per_sec": 0 00:18:39.467 }, 00:18:39.467 "claimed": false, 00:18:39.467 "zoned": false, 00:18:39.467 "supported_io_types": { 00:18:39.467 "read": true, 00:18:39.467 "write": true, 00:18:39.467 "unmap": true, 00:18:39.467 "write_zeroes": true, 00:18:39.467 "flush": true, 00:18:39.467 "reset": true, 00:18:39.467 "compare": false, 00:18:39.467 "compare_and_write": false, 00:18:39.467 "abort": true, 00:18:39.467 "nvme_admin": false, 00:18:39.467 "nvme_io": false 00:18:39.467 }, 00:18:39.467 "memory_domains": [ 00:18:39.467 { 00:18:39.467 "dma_device_id": "system", 00:18:39.467 "dma_device_type": 1 00:18:39.467 }, 00:18:39.467 { 00:18:39.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.467 "dma_device_type": 2 00:18:39.467 } 00:18:39.467 ], 00:18:39.467 "driver_specific": {} 00:18:39.467 } 00:18:39.467 ] 00:18:39.725 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:39.725 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:39.725 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:39.725 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:39.725 BaseBdev4 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.048 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:40.307 [ 00:18:40.307 { 00:18:40.307 "name": "BaseBdev4", 00:18:40.307 "aliases": [ 00:18:40.307 "8c744de9-1295-498b-9b90-23943f6e231d" 00:18:40.307 ], 00:18:40.307 
"product_name": "Malloc disk", 00:18:40.307 "block_size": 512, 00:18:40.307 "num_blocks": 65536, 00:18:40.307 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:40.307 "assigned_rate_limits": { 00:18:40.307 "rw_ios_per_sec": 0, 00:18:40.307 "rw_mbytes_per_sec": 0, 00:18:40.307 "r_mbytes_per_sec": 0, 00:18:40.307 "w_mbytes_per_sec": 0 00:18:40.307 }, 00:18:40.307 "claimed": false, 00:18:40.307 "zoned": false, 00:18:40.307 "supported_io_types": { 00:18:40.307 "read": true, 00:18:40.307 "write": true, 00:18:40.307 "unmap": true, 00:18:40.307 "write_zeroes": true, 00:18:40.307 "flush": true, 00:18:40.307 "reset": true, 00:18:40.307 "compare": false, 00:18:40.307 "compare_and_write": false, 00:18:40.307 "abort": true, 00:18:40.307 "nvme_admin": false, 00:18:40.307 "nvme_io": false 00:18:40.307 }, 00:18:40.307 "memory_domains": [ 00:18:40.307 { 00:18:40.307 "dma_device_id": "system", 00:18:40.307 "dma_device_type": 1 00:18:40.307 }, 00:18:40.307 { 00:18:40.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.307 "dma_device_type": 2 00:18:40.307 } 00:18:40.307 ], 00:18:40.307 "driver_specific": {} 00:18:40.307 } 00:18:40.307 ] 00:18:40.307 00:00:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:40.307 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:40.307 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:40.307 00:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:40.566 [2024-05-15 00:00:41.028056] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:40.566 [2024-05-15 00:00:41.028100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:40.566 [2024-05-15 00:00:41.028121] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:40.566 [2024-05-15 00:00:41.029535] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:40.566 [2024-05-15 00:00:41.029576] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:40.566 00:00:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.566 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.826 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:40.826 "name": "Existed_Raid", 00:18:40.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.826 "strip_size_kb": 0, 00:18:40.826 "state": "configuring", 00:18:40.826 "raid_level": "raid1", 00:18:40.826 "superblock": false, 00:18:40.826 "num_base_bdevs": 4, 00:18:40.826 "num_base_bdevs_discovered": 3, 00:18:40.826 "num_base_bdevs_operational": 4, 00:18:40.826 "base_bdevs_list": [ 00:18:40.826 { 00:18:40.826 "name": "BaseBdev1", 00:18:40.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.826 "is_configured": false, 00:18:40.826 "data_offset": 0, 00:18:40.826 "data_size": 0 00:18:40.826 }, 00:18:40.826 { 00:18:40.826 "name": "BaseBdev2", 00:18:40.826 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:40.826 "is_configured": true, 00:18:40.826 "data_offset": 0, 00:18:40.826 "data_size": 65536 00:18:40.826 }, 00:18:40.826 { 00:18:40.826 "name": "BaseBdev3", 00:18:40.826 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:40.826 "is_configured": true, 00:18:40.826 "data_offset": 0, 00:18:40.826 "data_size": 65536 00:18:40.826 }, 00:18:40.826 { 00:18:40.826 "name": "BaseBdev4", 00:18:40.826 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:40.826 "is_configured": true, 00:18:40.826 "data_offset": 0, 00:18:40.826 "data_size": 65536 00:18:40.826 } 00:18:40.826 ] 00:18:40.826 }' 00:18:40.826 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:40.826 00:00:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.395 00:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:41.654 [2024-05-15 00:00:42.110921] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.654 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.912 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:41.912 "name": "Existed_Raid", 00:18:41.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.912 "strip_size_kb": 0, 00:18:41.912 "state": "configuring", 00:18:41.912 "raid_level": "raid1", 00:18:41.912 "superblock": false, 00:18:41.912 "num_base_bdevs": 4, 00:18:41.912 "num_base_bdevs_discovered": 2, 00:18:41.912 "num_base_bdevs_operational": 4, 00:18:41.912 "base_bdevs_list": [ 00:18:41.912 { 00:18:41.912 "name": "BaseBdev1", 00:18:41.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.912 "is_configured": false, 00:18:41.912 "data_offset": 0, 00:18:41.912 "data_size": 0 00:18:41.912 }, 00:18:41.912 { 00:18:41.912 "name": null, 00:18:41.912 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:41.912 "is_configured": false, 00:18:41.912 "data_offset": 0, 00:18:41.912 "data_size": 65536 00:18:41.912 }, 00:18:41.912 { 00:18:41.912 "name": "BaseBdev3", 00:18:41.912 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:41.912 "is_configured": true, 00:18:41.912 "data_offset": 0, 00:18:41.912 "data_size": 65536 00:18:41.912 }, 00:18:41.912 { 00:18:41.912 "name": "BaseBdev4", 00:18:41.912 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:41.912 "is_configured": true, 00:18:41.912 "data_offset": 0, 00:18:41.912 "data_size": 65536 00:18:41.912 } 00:18:41.912 ] 00:18:41.912 }' 00:18:41.912 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:41.912 00:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.477 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:42.477 00:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.736 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:18:42.736 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:43.007 [2024-05-15 00:00:43.447024] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.007 BaseBdev1 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:43.007 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:18:43.279 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:43.538 [ 00:18:43.538 { 00:18:43.538 "name": "BaseBdev1", 00:18:43.538 "aliases": [ 00:18:43.538 "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74" 00:18:43.538 ], 00:18:43.538 "product_name": "Malloc disk", 00:18:43.538 "block_size": 512, 00:18:43.538 "num_blocks": 65536, 00:18:43.538 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:43.538 "assigned_rate_limits": { 00:18:43.538 "rw_ios_per_sec": 0, 00:18:43.538 "rw_mbytes_per_sec": 0, 00:18:43.538 "r_mbytes_per_sec": 0, 00:18:43.538 "w_mbytes_per_sec": 0 00:18:43.538 }, 00:18:43.538 "claimed": true, 00:18:43.538 "claim_type": "exclusive_write", 00:18:43.538 "zoned": false, 00:18:43.538 "supported_io_types": { 00:18:43.538 "read": true, 00:18:43.538 "write": true, 00:18:43.538 "unmap": true, 00:18:43.538 "write_zeroes": true, 00:18:43.538 "flush": true, 00:18:43.538 "reset": true, 00:18:43.538 "compare": false, 00:18:43.538 "compare_and_write": false, 00:18:43.538 "abort": true, 00:18:43.538 "nvme_admin": false, 00:18:43.538 "nvme_io": false 00:18:43.538 }, 00:18:43.538 "memory_domains": [ 00:18:43.538 { 00:18:43.538 "dma_device_id": "system", 00:18:43.538 "dma_device_type": 1 00:18:43.538 }, 00:18:43.538 { 00:18:43.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.538 "dma_device_type": 2 00:18:43.538 } 00:18:43.538 ], 00:18:43.538 "driver_specific": {} 00:18:43.538 } 00:18:43.538 ] 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:43.538 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.539 00:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.797 00:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:43.797 "name": "Existed_Raid", 00:18:43.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.797 "strip_size_kb": 0, 00:18:43.797 "state": "configuring", 00:18:43.797 "raid_level": "raid1", 00:18:43.797 "superblock": false, 00:18:43.797 
"num_base_bdevs": 4, 00:18:43.797 "num_base_bdevs_discovered": 3, 00:18:43.797 "num_base_bdevs_operational": 4, 00:18:43.797 "base_bdevs_list": [ 00:18:43.797 { 00:18:43.797 "name": "BaseBdev1", 00:18:43.797 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:43.797 "is_configured": true, 00:18:43.797 "data_offset": 0, 00:18:43.797 "data_size": 65536 00:18:43.797 }, 00:18:43.797 { 00:18:43.797 "name": null, 00:18:43.797 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:43.797 "is_configured": false, 00:18:43.797 "data_offset": 0, 00:18:43.797 "data_size": 65536 00:18:43.797 }, 00:18:43.797 { 00:18:43.797 "name": "BaseBdev3", 00:18:43.797 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:43.797 "is_configured": true, 00:18:43.797 "data_offset": 0, 00:18:43.797 "data_size": 65536 00:18:43.797 }, 00:18:43.797 { 00:18:43.797 "name": "BaseBdev4", 00:18:43.797 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:43.797 "is_configured": true, 00:18:43.797 "data_offset": 0, 00:18:43.798 "data_size": 65536 00:18:43.798 } 00:18:43.798 ] 00:18:43.798 }' 00:18:43.798 00:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:43.798 00:00:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.364 00:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.364 00:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:44.623 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:18:44.623 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:44.881 [2024-05-15 00:00:45.275912] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.881 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.140 00:00:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:45.140 "name": "Existed_Raid", 00:18:45.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.140 "strip_size_kb": 0, 00:18:45.140 "state": "configuring", 00:18:45.140 "raid_level": "raid1", 00:18:45.140 "superblock": false, 00:18:45.140 "num_base_bdevs": 4, 00:18:45.140 "num_base_bdevs_discovered": 2, 00:18:45.140 "num_base_bdevs_operational": 4, 00:18:45.140 "base_bdevs_list": [ 00:18:45.140 { 00:18:45.140 "name": "BaseBdev1", 00:18:45.140 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:45.140 "is_configured": true, 00:18:45.140 "data_offset": 0, 00:18:45.140 "data_size": 65536 00:18:45.140 }, 00:18:45.140 { 00:18:45.140 "name": null, 00:18:45.140 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:45.140 "is_configured": false, 00:18:45.140 "data_offset": 0, 00:18:45.140 "data_size": 65536 00:18:45.140 }, 00:18:45.140 { 00:18:45.140 "name": null, 00:18:45.140 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:45.140 "is_configured": false, 00:18:45.140 "data_offset": 0, 00:18:45.140 "data_size": 65536 00:18:45.140 }, 00:18:45.140 { 00:18:45.140 "name": "BaseBdev4", 00:18:45.140 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:45.140 "is_configured": true, 00:18:45.140 "data_offset": 0, 00:18:45.140 "data_size": 65536 00:18:45.140 } 00:18:45.140 ] 00:18:45.140 }' 00:18:45.141 00:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:45.141 00:00:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.707 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:45.707 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.964 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:18:45.964 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:46.222 [2024-05-15 00:00:46.587412] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 
00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.222 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.480 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:46.480 "name": "Existed_Raid", 00:18:46.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.480 "strip_size_kb": 0, 00:18:46.480 "state": "configuring", 00:18:46.480 "raid_level": "raid1", 00:18:46.480 "superblock": false, 00:18:46.480 "num_base_bdevs": 4, 00:18:46.480 "num_base_bdevs_discovered": 3, 00:18:46.480 "num_base_bdevs_operational": 4, 00:18:46.480 "base_bdevs_list": [ 00:18:46.480 { 00:18:46.480 "name": "BaseBdev1", 00:18:46.480 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:46.480 "is_configured": true, 00:18:46.480 "data_offset": 0, 00:18:46.480 "data_size": 65536 00:18:46.480 }, 00:18:46.480 { 00:18:46.480 "name": null, 00:18:46.480 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:46.480 "is_configured": false, 00:18:46.480 "data_offset": 0, 00:18:46.480 "data_size": 65536 00:18:46.480 }, 00:18:46.480 { 00:18:46.480 "name": "BaseBdev3", 00:18:46.480 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:46.480 "is_configured": true, 00:18:46.480 "data_offset": 0, 00:18:46.480 "data_size": 65536 00:18:46.480 }, 00:18:46.480 { 00:18:46.480 "name": "BaseBdev4", 00:18:46.480 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:46.480 "is_configured": true, 00:18:46.480 "data_offset": 0, 00:18:46.480 "data_size": 65536 00:18:46.480 } 00:18:46.480 ] 00:18:46.480 }' 00:18:46.480 00:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:46.480 00:00:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.046 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.046 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:47.303 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:18:47.303 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:47.561 [2024-05-15 00:00:47.906902] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 
00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.561 00:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.819 00:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:47.819 "name": "Existed_Raid", 00:18:47.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.819 "strip_size_kb": 0, 00:18:47.819 "state": "configuring", 00:18:47.819 "raid_level": "raid1", 00:18:47.819 "superblock": false, 00:18:47.819 "num_base_bdevs": 4, 00:18:47.819 "num_base_bdevs_discovered": 2, 00:18:47.819 "num_base_bdevs_operational": 4, 00:18:47.819 "base_bdevs_list": [ 00:18:47.819 { 00:18:47.819 "name": null, 00:18:47.819 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:47.819 "is_configured": false, 00:18:47.819 "data_offset": 0, 00:18:47.819 "data_size": 65536 00:18:47.819 }, 00:18:47.819 { 00:18:47.819 "name": null, 00:18:47.819 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:47.819 "is_configured": false, 00:18:47.819 "data_offset": 0, 00:18:47.819 "data_size": 65536 00:18:47.819 }, 00:18:47.819 { 00:18:47.819 "name": "BaseBdev3", 00:18:47.819 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:47.819 "is_configured": true, 00:18:47.819 "data_offset": 0, 00:18:47.819 "data_size": 65536 00:18:47.819 }, 00:18:47.819 { 00:18:47.819 "name": "BaseBdev4", 00:18:47.819 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:47.819 "is_configured": true, 00:18:47.819 "data_offset": 0, 00:18:47.819 "data_size": 65536 00:18:47.819 } 00:18:47.819 ] 00:18:47.819 }' 00:18:47.819 00:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:47.819 00:00:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.386 00:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.386 00:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:48.644 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:18:48.644 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:48.901 [2024-05-15 00:00:49.257241] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.901 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.160 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:49.160 "name": "Existed_Raid", 00:18:49.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.160 "strip_size_kb": 0, 00:18:49.160 "state": "configuring", 00:18:49.160 "raid_level": "raid1", 00:18:49.160 "superblock": false, 00:18:49.160 "num_base_bdevs": 4, 00:18:49.160 "num_base_bdevs_discovered": 3, 00:18:49.160 "num_base_bdevs_operational": 4, 00:18:49.160 "base_bdevs_list": [ 00:18:49.160 { 00:18:49.160 "name": null, 00:18:49.160 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:49.160 "is_configured": false, 00:18:49.160 "data_offset": 0, 00:18:49.160 "data_size": 65536 00:18:49.160 }, 00:18:49.160 { 00:18:49.160 "name": "BaseBdev2", 00:18:49.160 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:49.160 "is_configured": true, 00:18:49.160 "data_offset": 0, 00:18:49.160 "data_size": 65536 00:18:49.160 }, 00:18:49.160 { 00:18:49.160 "name": "BaseBdev3", 00:18:49.160 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:49.160 "is_configured": true, 00:18:49.160 "data_offset": 0, 00:18:49.160 "data_size": 65536 00:18:49.160 }, 00:18:49.160 { 00:18:49.160 "name": "BaseBdev4", 00:18:49.160 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:49.160 "is_configured": true, 00:18:49.160 "data_offset": 0, 00:18:49.160 "data_size": 65536 00:18:49.160 } 00:18:49.160 ] 00:18:49.160 }' 00:18:49.160 00:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:49.160 00:00:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.727 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.727 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:49.986 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:18:49.986 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.986 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:50.245 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74 00:18:50.502 [2024-05-15 00:00:50.840870] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:50.502 [2024-05-15 00:00:50.840908] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x10a6a10 00:18:50.502 [2024-05-15 00:00:50.840917] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:50.502 [2024-05-15 00:00:50.841111] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10aab60 00:18:50.502 [2024-05-15 00:00:50.841248] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10a6a10 00:18:50.502 [2024-05-15 00:00:50.841259] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10a6a10 00:18:50.502 [2024-05-15 00:00:50.841436] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:50.502 NewBaseBdev 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:50.502 00:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:50.760 00:00:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:51.018 [ 00:18:51.018 { 00:18:51.018 "name": "NewBaseBdev", 00:18:51.018 "aliases": [ 00:18:51.018 "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74" 00:18:51.018 ], 00:18:51.018 "product_name": "Malloc disk", 00:18:51.018 "block_size": 512, 00:18:51.019 "num_blocks": 65536, 00:18:51.019 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:51.019 "assigned_rate_limits": { 00:18:51.019 "rw_ios_per_sec": 0, 00:18:51.019 "rw_mbytes_per_sec": 0, 00:18:51.019 "r_mbytes_per_sec": 0, 00:18:51.019 "w_mbytes_per_sec": 0 00:18:51.019 }, 00:18:51.019 "claimed": true, 00:18:51.019 "claim_type": "exclusive_write", 00:18:51.019 "zoned": false, 00:18:51.019 "supported_io_types": { 00:18:51.019 "read": true, 00:18:51.019 "write": true, 00:18:51.019 "unmap": true, 00:18:51.019 "write_zeroes": true, 00:18:51.019 "flush": true, 00:18:51.019 "reset": true, 00:18:51.019 "compare": false, 00:18:51.019 "compare_and_write": false, 00:18:51.019 "abort": true, 00:18:51.019 "nvme_admin": false, 00:18:51.019 "nvme_io": false 00:18:51.019 }, 00:18:51.019 "memory_domains": [ 00:18:51.019 { 00:18:51.019 "dma_device_id": "system", 00:18:51.019 "dma_device_type": 1 00:18:51.019 }, 00:18:51.019 { 00:18:51.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.019 "dma_device_type": 2 00:18:51.019 } 00:18:51.019 ], 00:18:51.019 "driver_specific": {} 00:18:51.019 } 00:18:51.019 ] 
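For reference, the RPC sequence traced just above can be replayed by hand against the same socket. This is only a condensed sketch, not part of the test scripts: it assumes the bdev_svc app from this run is still serving /var/tmp/spdk-raid.sock, the $RPC shorthand is added for brevity, and the expected values in the comments are taken from the trace rather than guaranteed.

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Recreate the removed base bdev under a new name but with its original UUID;
# the raid module claims it on examine (the "bdev NewBaseBdev is claimed"
# message above), apparently keyed off that preserved UUID.
$RPC bdev_malloc_create 32 512 -b NewBaseBdev -u 75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74
$RPC bdev_wait_for_examine

# Mirrors the waitforbdev helper traced above: fetch the bdev with a timeout
# and confirm the raid has taken ownership of it.
$RPC bdev_get_bdevs -b NewBaseBdev -t 2000 | jq '.[0].claimed'        # expect: true

# With all four base bdevs back, Existed_Raid should report "online".
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'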
00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.019 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.277 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:51.277 "name": "Existed_Raid", 00:18:51.277 "uuid": "6c8c44cd-5689-4fdd-81ba-c8dae132466f", 00:18:51.277 "strip_size_kb": 0, 00:18:51.277 "state": "online", 00:18:51.277 "raid_level": "raid1", 00:18:51.277 "superblock": false, 00:18:51.277 "num_base_bdevs": 4, 00:18:51.277 "num_base_bdevs_discovered": 4, 00:18:51.277 "num_base_bdevs_operational": 4, 00:18:51.277 "base_bdevs_list": [ 00:18:51.277 { 00:18:51.277 "name": "NewBaseBdev", 00:18:51.277 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:51.277 "is_configured": true, 00:18:51.277 "data_offset": 0, 00:18:51.277 "data_size": 65536 00:18:51.277 }, 00:18:51.277 { 00:18:51.277 "name": "BaseBdev2", 00:18:51.277 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:51.277 "is_configured": true, 00:18:51.277 "data_offset": 0, 00:18:51.277 "data_size": 65536 00:18:51.277 }, 00:18:51.277 { 00:18:51.277 "name": "BaseBdev3", 00:18:51.277 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:51.277 "is_configured": true, 00:18:51.277 "data_offset": 0, 00:18:51.277 "data_size": 65536 00:18:51.277 }, 00:18:51.277 { 00:18:51.277 "name": "BaseBdev4", 00:18:51.277 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:51.277 "is_configured": true, 00:18:51.277 "data_offset": 0, 00:18:51.277 "data_size": 65536 00:18:51.277 } 00:18:51.277 ] 00:18:51.277 }' 00:18:51.277 00:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:51.277 00:00:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # 
local raid_bdev_info 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:51.845 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:52.104 [2024-05-15 00:00:52.437410] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:52.104 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:52.104 "name": "Existed_Raid", 00:18:52.104 "aliases": [ 00:18:52.104 "6c8c44cd-5689-4fdd-81ba-c8dae132466f" 00:18:52.104 ], 00:18:52.104 "product_name": "Raid Volume", 00:18:52.104 "block_size": 512, 00:18:52.104 "num_blocks": 65536, 00:18:52.104 "uuid": "6c8c44cd-5689-4fdd-81ba-c8dae132466f", 00:18:52.104 "assigned_rate_limits": { 00:18:52.104 "rw_ios_per_sec": 0, 00:18:52.104 "rw_mbytes_per_sec": 0, 00:18:52.104 "r_mbytes_per_sec": 0, 00:18:52.104 "w_mbytes_per_sec": 0 00:18:52.104 }, 00:18:52.104 "claimed": false, 00:18:52.104 "zoned": false, 00:18:52.104 "supported_io_types": { 00:18:52.104 "read": true, 00:18:52.104 "write": true, 00:18:52.104 "unmap": false, 00:18:52.104 "write_zeroes": true, 00:18:52.104 "flush": false, 00:18:52.104 "reset": true, 00:18:52.104 "compare": false, 00:18:52.104 "compare_and_write": false, 00:18:52.104 "abort": false, 00:18:52.104 "nvme_admin": false, 00:18:52.104 "nvme_io": false 00:18:52.104 }, 00:18:52.104 "memory_domains": [ 00:18:52.104 { 00:18:52.104 "dma_device_id": "system", 00:18:52.104 "dma_device_type": 1 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.104 "dma_device_type": 2 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "system", 00:18:52.104 "dma_device_type": 1 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.104 "dma_device_type": 2 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "system", 00:18:52.104 "dma_device_type": 1 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.104 "dma_device_type": 2 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "system", 00:18:52.104 "dma_device_type": 1 00:18:52.104 }, 00:18:52.104 { 00:18:52.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.104 "dma_device_type": 2 00:18:52.104 } 00:18:52.104 ], 00:18:52.104 "driver_specific": { 00:18:52.104 "raid": { 00:18:52.104 "uuid": "6c8c44cd-5689-4fdd-81ba-c8dae132466f", 00:18:52.104 "strip_size_kb": 0, 00:18:52.104 "state": "online", 00:18:52.104 "raid_level": "raid1", 00:18:52.104 "superblock": false, 00:18:52.104 "num_base_bdevs": 4, 00:18:52.104 "num_base_bdevs_discovered": 4, 00:18:52.104 "num_base_bdevs_operational": 4, 00:18:52.104 "base_bdevs_list": [ 00:18:52.104 { 00:18:52.104 "name": "NewBaseBdev", 00:18:52.104 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:52.104 "is_configured": true, 00:18:52.104 "data_offset": 0, 00:18:52.104 "data_size": 65536 00:18:52.104 }, 00:18:52.104 { 00:18:52.105 "name": "BaseBdev2", 00:18:52.105 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:52.105 "is_configured": true, 
00:18:52.105 "data_offset": 0, 00:18:52.105 "data_size": 65536 00:18:52.105 }, 00:18:52.105 { 00:18:52.105 "name": "BaseBdev3", 00:18:52.105 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:52.105 "is_configured": true, 00:18:52.105 "data_offset": 0, 00:18:52.105 "data_size": 65536 00:18:52.105 }, 00:18:52.105 { 00:18:52.105 "name": "BaseBdev4", 00:18:52.105 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:52.105 "is_configured": true, 00:18:52.105 "data_offset": 0, 00:18:52.105 "data_size": 65536 00:18:52.105 } 00:18:52.105 ] 00:18:52.105 } 00:18:52.105 } 00:18:52.105 }' 00:18:52.105 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:52.105 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:18:52.105 BaseBdev2 00:18:52.105 BaseBdev3 00:18:52.105 BaseBdev4' 00:18:52.105 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:52.105 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:52.105 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:52.363 "name": "NewBaseBdev", 00:18:52.363 "aliases": [ 00:18:52.363 "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74" 00:18:52.363 ], 00:18:52.363 "product_name": "Malloc disk", 00:18:52.363 "block_size": 512, 00:18:52.363 "num_blocks": 65536, 00:18:52.363 "uuid": "75ab35b0-1a2a-4816-b5cf-08fd1d9f0a74", 00:18:52.363 "assigned_rate_limits": { 00:18:52.363 "rw_ios_per_sec": 0, 00:18:52.363 "rw_mbytes_per_sec": 0, 00:18:52.363 "r_mbytes_per_sec": 0, 00:18:52.363 "w_mbytes_per_sec": 0 00:18:52.363 }, 00:18:52.363 "claimed": true, 00:18:52.363 "claim_type": "exclusive_write", 00:18:52.363 "zoned": false, 00:18:52.363 "supported_io_types": { 00:18:52.363 "read": true, 00:18:52.363 "write": true, 00:18:52.363 "unmap": true, 00:18:52.363 "write_zeroes": true, 00:18:52.363 "flush": true, 00:18:52.363 "reset": true, 00:18:52.363 "compare": false, 00:18:52.363 "compare_and_write": false, 00:18:52.363 "abort": true, 00:18:52.363 "nvme_admin": false, 00:18:52.363 "nvme_io": false 00:18:52.363 }, 00:18:52.363 "memory_domains": [ 00:18:52.363 { 00:18:52.363 "dma_device_id": "system", 00:18:52.363 "dma_device_type": 1 00:18:52.363 }, 00:18:52.363 { 00:18:52.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.363 "dma_device_type": 2 00:18:52.363 } 00:18:52.363 ], 00:18:52.363 "driver_specific": {} 00:18:52.363 }' 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.363 00:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:52.622 00:00:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:52.622 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:52.880 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:52.880 "name": "BaseBdev2", 00:18:52.880 "aliases": [ 00:18:52.880 "cc0f8a24-3a79-49d9-bb55-8d34c7885efa" 00:18:52.880 ], 00:18:52.880 "product_name": "Malloc disk", 00:18:52.880 "block_size": 512, 00:18:52.880 "num_blocks": 65536, 00:18:52.880 "uuid": "cc0f8a24-3a79-49d9-bb55-8d34c7885efa", 00:18:52.880 "assigned_rate_limits": { 00:18:52.880 "rw_ios_per_sec": 0, 00:18:52.880 "rw_mbytes_per_sec": 0, 00:18:52.880 "r_mbytes_per_sec": 0, 00:18:52.880 "w_mbytes_per_sec": 0 00:18:52.880 }, 00:18:52.880 "claimed": true, 00:18:52.880 "claim_type": "exclusive_write", 00:18:52.880 "zoned": false, 00:18:52.880 "supported_io_types": { 00:18:52.880 "read": true, 00:18:52.880 "write": true, 00:18:52.880 "unmap": true, 00:18:52.880 "write_zeroes": true, 00:18:52.880 "flush": true, 00:18:52.880 "reset": true, 00:18:52.880 "compare": false, 00:18:52.880 "compare_and_write": false, 00:18:52.880 "abort": true, 00:18:52.880 "nvme_admin": false, 00:18:52.880 "nvme_io": false 00:18:52.880 }, 00:18:52.880 "memory_domains": [ 00:18:52.880 { 00:18:52.880 "dma_device_id": "system", 00:18:52.880 "dma_device_type": 1 00:18:52.880 }, 00:18:52.880 { 00:18:52.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.880 "dma_device_type": 2 00:18:52.880 } 00:18:52.880 ], 00:18:52.880 "driver_specific": {} 00:18:52.880 }' 00:18:52.880 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:52.880 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:52.880 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:52.880 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ 
null == null ]] 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:53.137 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:53.396 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:53.396 "name": "BaseBdev3", 00:18:53.396 "aliases": [ 00:18:53.396 "163ba573-8540-430b-b816-59505d0a6cbe" 00:18:53.396 ], 00:18:53.396 "product_name": "Malloc disk", 00:18:53.396 "block_size": 512, 00:18:53.396 "num_blocks": 65536, 00:18:53.396 "uuid": "163ba573-8540-430b-b816-59505d0a6cbe", 00:18:53.396 "assigned_rate_limits": { 00:18:53.396 "rw_ios_per_sec": 0, 00:18:53.396 "rw_mbytes_per_sec": 0, 00:18:53.396 "r_mbytes_per_sec": 0, 00:18:53.396 "w_mbytes_per_sec": 0 00:18:53.396 }, 00:18:53.396 "claimed": true, 00:18:53.396 "claim_type": "exclusive_write", 00:18:53.396 "zoned": false, 00:18:53.396 "supported_io_types": { 00:18:53.396 "read": true, 00:18:53.396 "write": true, 00:18:53.396 "unmap": true, 00:18:53.396 "write_zeroes": true, 00:18:53.396 "flush": true, 00:18:53.396 "reset": true, 00:18:53.396 "compare": false, 00:18:53.396 "compare_and_write": false, 00:18:53.396 "abort": true, 00:18:53.396 "nvme_admin": false, 00:18:53.396 "nvme_io": false 00:18:53.396 }, 00:18:53.396 "memory_domains": [ 00:18:53.396 { 00:18:53.396 "dma_device_id": "system", 00:18:53.396 "dma_device_type": 1 00:18:53.396 }, 00:18:53.396 { 00:18:53.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.396 "dma_device_type": 2 00:18:53.396 } 00:18:53.396 ], 00:18:53.396 "driver_specific": {} 00:18:53.396 }' 00:18:53.396 00:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.654 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:53.949 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:53.949 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:53.949 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:53.949 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:53.949 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:54.229 00:00:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:54.229 "name": "BaseBdev4", 00:18:54.229 "aliases": [ 00:18:54.230 "8c744de9-1295-498b-9b90-23943f6e231d" 00:18:54.230 ], 00:18:54.230 "product_name": "Malloc disk", 00:18:54.230 "block_size": 512, 00:18:54.230 "num_blocks": 65536, 00:18:54.230 "uuid": "8c744de9-1295-498b-9b90-23943f6e231d", 00:18:54.230 "assigned_rate_limits": { 00:18:54.230 "rw_ios_per_sec": 0, 00:18:54.230 "rw_mbytes_per_sec": 0, 00:18:54.230 "r_mbytes_per_sec": 0, 00:18:54.230 "w_mbytes_per_sec": 0 00:18:54.230 }, 00:18:54.230 "claimed": true, 00:18:54.230 "claim_type": "exclusive_write", 00:18:54.230 "zoned": false, 00:18:54.230 "supported_io_types": { 00:18:54.230 "read": true, 00:18:54.230 "write": true, 00:18:54.230 "unmap": true, 00:18:54.230 "write_zeroes": true, 00:18:54.230 "flush": true, 00:18:54.230 "reset": true, 00:18:54.230 "compare": false, 00:18:54.230 "compare_and_write": false, 00:18:54.230 "abort": true, 00:18:54.230 "nvme_admin": false, 00:18:54.230 "nvme_io": false 00:18:54.230 }, 00:18:54.230 "memory_domains": [ 00:18:54.230 { 00:18:54.230 "dma_device_id": "system", 00:18:54.230 "dma_device_type": 1 00:18:54.230 }, 00:18:54.230 { 00:18:54.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.230 "dma_device_type": 2 00:18:54.230 } 00:18:54.230 ], 00:18:54.230 "driver_specific": {} 00:18:54.230 }' 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.230 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:54.487 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:54.487 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:54.487 00:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:54.744 [2024-05-15 00:00:55.136276] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:54.744 [2024-05-15 00:00:55.136301] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:54.744 [2024-05-15 00:00:55.136357] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:54.744 [2024-05-15 00:00:55.136650] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:54.744 [2024-05-15 00:00:55.136663] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10a6a10 name Existed_Raid, state offline 00:18:54.744 00:00:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 458906 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 458906 ']' 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 458906 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 458906 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 458906' 00:18:54.744 killing process with pid 458906 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 458906 00:18:54.744 [2024-05-15 00:00:55.206613] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:54.744 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 458906 00:18:54.744 [2024-05-15 00:00:55.243927] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:18:55.003 00:18:55.003 real 0m32.138s 00:18:55.003 user 0m59.056s 00:18:55.003 sys 0m5.688s 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.003 ************************************ 00:18:55.003 END TEST raid_state_function_test 00:18:55.003 ************************************ 00:18:55.003 00:00:55 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:18:55.003 00:00:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:18:55.003 00:00:55 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:55.003 00:00:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:55.003 ************************************ 00:18:55.003 START TEST raid_state_function_test_sb 00:18:55.003 ************************************ 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 true 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:18:55.003 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=463772 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 463772' 00:18:55.261 Process raid pid: 463772 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 463772 /var/tmp/spdk-raid.sock 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 463772 ']' 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
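The prologue being traced here (start bdev_svc on a private RPC socket, wait for it to listen, then drive everything through rpc.py) can be reproduced outside the harness roughly as follows. This is a simplified sketch under stated assumptions: it is run from the SPDK repository root (so the absolute Jenkins paths become relative ones), and the waitforlisten/killprocess helpers are replaced by a crude socket poll and a plain kill.

./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!

# Crude stand-in for waitforlisten: wait until the UNIX domain socket exists.
while [ ! -S /var/tmp/spdk-raid.sock ]; do sleep 0.2; done

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# -s requests a superblock; with none of the named base bdevs present yet, the
# raid is created in the "configuring" state, which is what the test asserts next.
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # configuring

kill "$raid_pid"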
00:18:55.261 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:55.261 00:00:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.261 [2024-05-15 00:00:55.652530] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:18:55.261 [2024-05-15 00:00:55.652601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:55.261 [2024-05-15 00:00:55.785879] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:55.519 [2024-05-15 00:00:55.893873] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:55.519 [2024-05-15 00:00:55.953311] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:55.519 [2024-05-15 00:00:55.953346] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.085 00:00:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:56.085 00:00:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:18:56.085 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:56.085 [2024-05-15 00:00:56.675220] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:56.085 [2024-05-15 00:00:56.675265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:56.085 [2024-05-15 00:00:56.675276] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:56.085 [2024-05-15 00:00:56.675288] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:56.085 [2024-05-15 00:00:56.675298] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:56.085 [2024-05-15 00:00:56.675310] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:56.085 [2024-05-15 00:00:56.675319] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:56.085 [2024-05-15 00:00:56.675331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:56.343 00:00:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.343 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.601 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:56.601 "name": "Existed_Raid", 00:18:56.601 "uuid": "2bb65da3-c003-4372-b63f-ba1ad84acfb6", 00:18:56.601 "strip_size_kb": 0, 00:18:56.601 "state": "configuring", 00:18:56.601 "raid_level": "raid1", 00:18:56.601 "superblock": true, 00:18:56.601 "num_base_bdevs": 4, 00:18:56.601 "num_base_bdevs_discovered": 0, 00:18:56.601 "num_base_bdevs_operational": 4, 00:18:56.601 "base_bdevs_list": [ 00:18:56.601 { 00:18:56.601 "name": "BaseBdev1", 00:18:56.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.601 "is_configured": false, 00:18:56.601 "data_offset": 0, 00:18:56.601 "data_size": 0 00:18:56.601 }, 00:18:56.601 { 00:18:56.601 "name": "BaseBdev2", 00:18:56.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.601 "is_configured": false, 00:18:56.601 "data_offset": 0, 00:18:56.601 "data_size": 0 00:18:56.601 }, 00:18:56.601 { 00:18:56.601 "name": "BaseBdev3", 00:18:56.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.601 "is_configured": false, 00:18:56.601 "data_offset": 0, 00:18:56.601 "data_size": 0 00:18:56.601 }, 00:18:56.601 { 00:18:56.601 "name": "BaseBdev4", 00:18:56.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.601 "is_configured": false, 00:18:56.601 "data_offset": 0, 00:18:56.601 "data_size": 0 00:18:56.601 } 00:18:56.601 ] 00:18:56.601 }' 00:18:56.601 00:00:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:56.601 00:00:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:57.168 00:00:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:57.427 [2024-05-15 00:00:57.761933] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:57.427 [2024-05-15 00:00:57.761968] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bfac00 name Existed_Raid, state configuring 00:18:57.427 00:00:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:57.427 [2024-05-15 00:00:57.942441] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:57.427 [2024-05-15 00:00:57.942472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:57.427 [2024-05-15 00:00:57.942482] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:57.427 [2024-05-15 00:00:57.942494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist 
now 00:18:57.427 [2024-05-15 00:00:57.942503] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:57.427 [2024-05-15 00:00:57.942514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:57.427 [2024-05-15 00:00:57.942523] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:57.427 [2024-05-15 00:00:57.942534] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:57.427 00:00:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:57.686 [2024-05-15 00:00:58.124754] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:57.686 BaseBdev1 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:57.686 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:57.945 [ 00:18:57.945 { 00:18:57.945 "name": "BaseBdev1", 00:18:57.945 "aliases": [ 00:18:57.945 "400f28d3-7db9-4eb6-927e-2b2bf45a3d33" 00:18:57.945 ], 00:18:57.945 "product_name": "Malloc disk", 00:18:57.945 "block_size": 512, 00:18:57.945 "num_blocks": 65536, 00:18:57.945 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:18:57.945 "assigned_rate_limits": { 00:18:57.945 "rw_ios_per_sec": 0, 00:18:57.945 "rw_mbytes_per_sec": 0, 00:18:57.945 "r_mbytes_per_sec": 0, 00:18:57.945 "w_mbytes_per_sec": 0 00:18:57.945 }, 00:18:57.945 "claimed": true, 00:18:57.945 "claim_type": "exclusive_write", 00:18:57.945 "zoned": false, 00:18:57.945 "supported_io_types": { 00:18:57.945 "read": true, 00:18:57.945 "write": true, 00:18:57.945 "unmap": true, 00:18:57.945 "write_zeroes": true, 00:18:57.945 "flush": true, 00:18:57.945 "reset": true, 00:18:57.945 "compare": false, 00:18:57.945 "compare_and_write": false, 00:18:57.945 "abort": true, 00:18:57.945 "nvme_admin": false, 00:18:57.945 "nvme_io": false 00:18:57.945 }, 00:18:57.945 "memory_domains": [ 00:18:57.945 { 00:18:57.945 "dma_device_id": "system", 00:18:57.945 "dma_device_type": 1 00:18:57.945 }, 00:18:57.945 { 00:18:57.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.945 "dma_device_type": 2 00:18:57.945 } 00:18:57.945 ], 00:18:57.945 "driver_specific": {} 00:18:57.945 } 00:18:57.945 ] 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.945 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:58.202 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:58.202 "name": "Existed_Raid", 00:18:58.202 "uuid": "32882863-307f-4f1f-bef4-486b1d97a801", 00:18:58.202 "strip_size_kb": 0, 00:18:58.202 "state": "configuring", 00:18:58.202 "raid_level": "raid1", 00:18:58.202 "superblock": true, 00:18:58.202 "num_base_bdevs": 4, 00:18:58.202 "num_base_bdevs_discovered": 1, 00:18:58.203 "num_base_bdevs_operational": 4, 00:18:58.203 "base_bdevs_list": [ 00:18:58.203 { 00:18:58.203 "name": "BaseBdev1", 00:18:58.203 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:18:58.203 "is_configured": true, 00:18:58.203 "data_offset": 2048, 00:18:58.203 "data_size": 63488 00:18:58.203 }, 00:18:58.203 { 00:18:58.203 "name": "BaseBdev2", 00:18:58.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.203 "is_configured": false, 00:18:58.203 "data_offset": 0, 00:18:58.203 "data_size": 0 00:18:58.203 }, 00:18:58.203 { 00:18:58.203 "name": "BaseBdev3", 00:18:58.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.203 "is_configured": false, 00:18:58.203 "data_offset": 0, 00:18:58.203 "data_size": 0 00:18:58.203 }, 00:18:58.203 { 00:18:58.203 "name": "BaseBdev4", 00:18:58.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.203 "is_configured": false, 00:18:58.203 "data_offset": 0, 00:18:58.203 "data_size": 0 00:18:58.203 } 00:18:58.203 ] 00:18:58.203 }' 00:18:58.203 00:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:58.203 00:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.767 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:59.025 [2024-05-15 00:00:59.560555] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:59.025 [2024-05-15 00:00:59.560598] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bfaea0 name 
Existed_Raid, state configuring 00:18:59.025 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:59.283 [2024-05-15 00:00:59.809266] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:59.283 [2024-05-15 00:00:59.810776] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:59.283 [2024-05-15 00:00:59.810811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:59.283 [2024-05-15 00:00:59.810822] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:59.283 [2024-05-15 00:00:59.810834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:59.283 [2024-05-15 00:00:59.810843] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:59.283 [2024-05-15 00:00:59.810859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:59.283 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.284 00:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.542 00:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:59.542 "name": "Existed_Raid", 00:18:59.542 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:18:59.542 "strip_size_kb": 0, 00:18:59.542 "state": "configuring", 00:18:59.542 "raid_level": "raid1", 00:18:59.542 "superblock": true, 00:18:59.542 "num_base_bdevs": 4, 00:18:59.542 "num_base_bdevs_discovered": 1, 00:18:59.542 "num_base_bdevs_operational": 4, 00:18:59.542 "base_bdevs_list": [ 00:18:59.542 { 00:18:59.542 "name": "BaseBdev1", 00:18:59.542 "uuid": 
"400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:18:59.542 "is_configured": true, 00:18:59.542 "data_offset": 2048, 00:18:59.542 "data_size": 63488 00:18:59.542 }, 00:18:59.542 { 00:18:59.542 "name": "BaseBdev2", 00:18:59.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.542 "is_configured": false, 00:18:59.542 "data_offset": 0, 00:18:59.542 "data_size": 0 00:18:59.542 }, 00:18:59.542 { 00:18:59.542 "name": "BaseBdev3", 00:18:59.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.542 "is_configured": false, 00:18:59.542 "data_offset": 0, 00:18:59.542 "data_size": 0 00:18:59.542 }, 00:18:59.542 { 00:18:59.542 "name": "BaseBdev4", 00:18:59.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.542 "is_configured": false, 00:18:59.542 "data_offset": 0, 00:18:59.542 "data_size": 0 00:18:59.542 } 00:18:59.542 ] 00:18:59.542 }' 00:18:59.542 00:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:59.542 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.106 00:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:00.364 [2024-05-15 00:01:00.915572] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:00.364 BaseBdev2 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:00.364 00:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:00.622 00:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:00.880 [ 00:19:00.880 { 00:19:00.880 "name": "BaseBdev2", 00:19:00.880 "aliases": [ 00:19:00.880 "79490254-059e-4d9a-8ad9-078caf748b5d" 00:19:00.880 ], 00:19:00.880 "product_name": "Malloc disk", 00:19:00.880 "block_size": 512, 00:19:00.880 "num_blocks": 65536, 00:19:00.880 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:00.880 "assigned_rate_limits": { 00:19:00.880 "rw_ios_per_sec": 0, 00:19:00.880 "rw_mbytes_per_sec": 0, 00:19:00.880 "r_mbytes_per_sec": 0, 00:19:00.880 "w_mbytes_per_sec": 0 00:19:00.880 }, 00:19:00.880 "claimed": true, 00:19:00.880 "claim_type": "exclusive_write", 00:19:00.880 "zoned": false, 00:19:00.880 "supported_io_types": { 00:19:00.880 "read": true, 00:19:00.880 "write": true, 00:19:00.880 "unmap": true, 00:19:00.880 "write_zeroes": true, 00:19:00.880 "flush": true, 00:19:00.880 "reset": true, 00:19:00.880 "compare": false, 00:19:00.880 "compare_and_write": false, 00:19:00.880 "abort": true, 00:19:00.880 "nvme_admin": false, 00:19:00.880 "nvme_io": 
false 00:19:00.880 }, 00:19:00.880 "memory_domains": [ 00:19:00.880 { 00:19:00.880 "dma_device_id": "system", 00:19:00.880 "dma_device_type": 1 00:19:00.880 }, 00:19:00.880 { 00:19:00.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.880 "dma_device_type": 2 00:19:00.880 } 00:19:00.880 ], 00:19:00.880 "driver_specific": {} 00:19:00.880 } 00:19:00.880 ] 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.880 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.138 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:01.138 "name": "Existed_Raid", 00:19:01.138 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:01.138 "strip_size_kb": 0, 00:19:01.138 "state": "configuring", 00:19:01.138 "raid_level": "raid1", 00:19:01.138 "superblock": true, 00:19:01.138 "num_base_bdevs": 4, 00:19:01.138 "num_base_bdevs_discovered": 2, 00:19:01.138 "num_base_bdevs_operational": 4, 00:19:01.138 "base_bdevs_list": [ 00:19:01.138 { 00:19:01.138 "name": "BaseBdev1", 00:19:01.138 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:19:01.138 "is_configured": true, 00:19:01.138 "data_offset": 2048, 00:19:01.138 "data_size": 63488 00:19:01.138 }, 00:19:01.138 { 00:19:01.138 "name": "BaseBdev2", 00:19:01.138 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:01.138 "is_configured": true, 00:19:01.138 "data_offset": 2048, 00:19:01.138 "data_size": 63488 00:19:01.138 }, 00:19:01.138 { 00:19:01.138 "name": "BaseBdev3", 00:19:01.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.138 "is_configured": false, 00:19:01.138 "data_offset": 0, 00:19:01.138 "data_size": 0 00:19:01.138 }, 00:19:01.138 { 00:19:01.138 "name": "BaseBdev4", 00:19:01.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.138 "is_configured": false, 00:19:01.138 
"data_offset": 0, 00:19:01.138 "data_size": 0 00:19:01.138 } 00:19:01.138 ] 00:19:01.138 }' 00:19:01.138 00:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:01.138 00:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.704 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:01.962 [2024-05-15 00:01:02.443082] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:01.962 BaseBdev3 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:01.962 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:02.220 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:02.477 [ 00:19:02.477 { 00:19:02.477 "name": "BaseBdev3", 00:19:02.477 "aliases": [ 00:19:02.477 "c40fe600-d233-41ba-b540-a9eaf6311ab6" 00:19:02.477 ], 00:19:02.477 "product_name": "Malloc disk", 00:19:02.477 "block_size": 512, 00:19:02.477 "num_blocks": 65536, 00:19:02.477 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:02.477 "assigned_rate_limits": { 00:19:02.477 "rw_ios_per_sec": 0, 00:19:02.477 "rw_mbytes_per_sec": 0, 00:19:02.477 "r_mbytes_per_sec": 0, 00:19:02.477 "w_mbytes_per_sec": 0 00:19:02.477 }, 00:19:02.477 "claimed": true, 00:19:02.477 "claim_type": "exclusive_write", 00:19:02.477 "zoned": false, 00:19:02.477 "supported_io_types": { 00:19:02.477 "read": true, 00:19:02.478 "write": true, 00:19:02.478 "unmap": true, 00:19:02.478 "write_zeroes": true, 00:19:02.478 "flush": true, 00:19:02.478 "reset": true, 00:19:02.478 "compare": false, 00:19:02.478 "compare_and_write": false, 00:19:02.478 "abort": true, 00:19:02.478 "nvme_admin": false, 00:19:02.478 "nvme_io": false 00:19:02.478 }, 00:19:02.478 "memory_domains": [ 00:19:02.478 { 00:19:02.478 "dma_device_id": "system", 00:19:02.478 "dma_device_type": 1 00:19:02.478 }, 00:19:02.478 { 00:19:02.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.478 "dma_device_type": 2 00:19:02.478 } 00:19:02.478 ], 00:19:02.478 "driver_specific": {} 00:19:02.478 } 00:19:02.478 ] 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 
-- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.478 00:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:02.736 00:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:02.736 "name": "Existed_Raid", 00:19:02.736 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:02.736 "strip_size_kb": 0, 00:19:02.736 "state": "configuring", 00:19:02.736 "raid_level": "raid1", 00:19:02.736 "superblock": true, 00:19:02.736 "num_base_bdevs": 4, 00:19:02.736 "num_base_bdevs_discovered": 3, 00:19:02.736 "num_base_bdevs_operational": 4, 00:19:02.736 "base_bdevs_list": [ 00:19:02.736 { 00:19:02.736 "name": "BaseBdev1", 00:19:02.736 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:19:02.736 "is_configured": true, 00:19:02.736 "data_offset": 2048, 00:19:02.736 "data_size": 63488 00:19:02.736 }, 00:19:02.736 { 00:19:02.736 "name": "BaseBdev2", 00:19:02.736 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:02.736 "is_configured": true, 00:19:02.736 "data_offset": 2048, 00:19:02.736 "data_size": 63488 00:19:02.736 }, 00:19:02.736 { 00:19:02.736 "name": "BaseBdev3", 00:19:02.736 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:02.736 "is_configured": true, 00:19:02.736 "data_offset": 2048, 00:19:02.736 "data_size": 63488 00:19:02.736 }, 00:19:02.736 { 00:19:02.736 "name": "BaseBdev4", 00:19:02.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.736 "is_configured": false, 00:19:02.736 "data_offset": 0, 00:19:02.736 "data_size": 0 00:19:02.736 } 00:19:02.736 ] 00:19:02.737 }' 00:19:02.737 00:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:02.737 00:01:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:03.304 00:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:03.562 [2024-05-15 00:01:04.042915] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:03.562 [2024-05-15 00:01:04.043094] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 
0x1bfa470 00:19:03.562 [2024-05-15 00:01:04.043107] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:03.562 [2024-05-15 00:01:04.043298] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bfab40 00:19:03.562 [2024-05-15 00:01:04.043448] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bfa470 00:19:03.562 [2024-05-15 00:01:04.043459] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bfa470 00:19:03.562 [2024-05-15 00:01:04.043561] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.562 BaseBdev4 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:03.562 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:03.820 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:04.078 [ 00:19:04.078 { 00:19:04.078 "name": "BaseBdev4", 00:19:04.078 "aliases": [ 00:19:04.078 "7bc88930-c855-4efa-92d8-7dbc1a8efd93" 00:19:04.078 ], 00:19:04.078 "product_name": "Malloc disk", 00:19:04.078 "block_size": 512, 00:19:04.078 "num_blocks": 65536, 00:19:04.078 "uuid": "7bc88930-c855-4efa-92d8-7dbc1a8efd93", 00:19:04.078 "assigned_rate_limits": { 00:19:04.078 "rw_ios_per_sec": 0, 00:19:04.078 "rw_mbytes_per_sec": 0, 00:19:04.078 "r_mbytes_per_sec": 0, 00:19:04.078 "w_mbytes_per_sec": 0 00:19:04.078 }, 00:19:04.078 "claimed": true, 00:19:04.078 "claim_type": "exclusive_write", 00:19:04.078 "zoned": false, 00:19:04.078 "supported_io_types": { 00:19:04.078 "read": true, 00:19:04.078 "write": true, 00:19:04.078 "unmap": true, 00:19:04.078 "write_zeroes": true, 00:19:04.078 "flush": true, 00:19:04.078 "reset": true, 00:19:04.078 "compare": false, 00:19:04.078 "compare_and_write": false, 00:19:04.078 "abort": true, 00:19:04.078 "nvme_admin": false, 00:19:04.078 "nvme_io": false 00:19:04.078 }, 00:19:04.078 "memory_domains": [ 00:19:04.078 { 00:19:04.078 "dma_device_id": "system", 00:19:04.078 "dma_device_type": 1 00:19:04.078 }, 00:19:04.078 { 00:19:04.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.078 "dma_device_type": 2 00:19:04.078 } 00:19:04.078 ], 00:19:04.078 "driver_specific": {} 00:19:04.078 } 00:19:04.078 ] 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 
-- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.078 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.336 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:04.336 "name": "Existed_Raid", 00:19:04.336 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:04.336 "strip_size_kb": 0, 00:19:04.336 "state": "online", 00:19:04.336 "raid_level": "raid1", 00:19:04.336 "superblock": true, 00:19:04.336 "num_base_bdevs": 4, 00:19:04.336 "num_base_bdevs_discovered": 4, 00:19:04.336 "num_base_bdevs_operational": 4, 00:19:04.336 "base_bdevs_list": [ 00:19:04.336 { 00:19:04.336 "name": "BaseBdev1", 00:19:04.336 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:19:04.336 "is_configured": true, 00:19:04.336 "data_offset": 2048, 00:19:04.336 "data_size": 63488 00:19:04.336 }, 00:19:04.336 { 00:19:04.336 "name": "BaseBdev2", 00:19:04.336 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:04.336 "is_configured": true, 00:19:04.336 "data_offset": 2048, 00:19:04.336 "data_size": 63488 00:19:04.336 }, 00:19:04.336 { 00:19:04.336 "name": "BaseBdev3", 00:19:04.336 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:04.336 "is_configured": true, 00:19:04.336 "data_offset": 2048, 00:19:04.336 "data_size": 63488 00:19:04.336 }, 00:19:04.336 { 00:19:04.336 "name": "BaseBdev4", 00:19:04.336 "uuid": "7bc88930-c855-4efa-92d8-7dbc1a8efd93", 00:19:04.336 "is_configured": true, 00:19:04.336 "data_offset": 2048, 00:19:04.337 "data_size": 63488 00:19:04.337 } 00:19:04.337 ] 00:19:04.337 }' 00:19:04.337 00:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:04.337 00:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_info 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:04.903 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:19:05.161 [2024-05-15 00:01:05.503097] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:19:05.161 "name": "Existed_Raid", 00:19:05.161 "aliases": [ 00:19:05.161 "d5ad4fa7-f323-4762-ba89-3a93ff57ee39" 00:19:05.161 ], 00:19:05.161 "product_name": "Raid Volume", 00:19:05.161 "block_size": 512, 00:19:05.161 "num_blocks": 63488, 00:19:05.161 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:05.161 "assigned_rate_limits": { 00:19:05.161 "rw_ios_per_sec": 0, 00:19:05.161 "rw_mbytes_per_sec": 0, 00:19:05.161 "r_mbytes_per_sec": 0, 00:19:05.161 "w_mbytes_per_sec": 0 00:19:05.161 }, 00:19:05.161 "claimed": false, 00:19:05.161 "zoned": false, 00:19:05.161 "supported_io_types": { 00:19:05.161 "read": true, 00:19:05.161 "write": true, 00:19:05.161 "unmap": false, 00:19:05.161 "write_zeroes": true, 00:19:05.161 "flush": false, 00:19:05.161 "reset": true, 00:19:05.161 "compare": false, 00:19:05.161 "compare_and_write": false, 00:19:05.161 "abort": false, 00:19:05.161 "nvme_admin": false, 00:19:05.161 "nvme_io": false 00:19:05.161 }, 00:19:05.161 "memory_domains": [ 00:19:05.161 { 00:19:05.161 "dma_device_id": "system", 00:19:05.161 "dma_device_type": 1 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.161 "dma_device_type": 2 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "system", 00:19:05.161 "dma_device_type": 1 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.161 "dma_device_type": 2 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "system", 00:19:05.161 "dma_device_type": 1 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.161 "dma_device_type": 2 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "system", 00:19:05.161 "dma_device_type": 1 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.161 "dma_device_type": 2 00:19:05.161 } 00:19:05.161 ], 00:19:05.161 "driver_specific": { 00:19:05.161 "raid": { 00:19:05.161 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:05.161 "strip_size_kb": 0, 00:19:05.161 "state": "online", 00:19:05.161 "raid_level": "raid1", 00:19:05.161 "superblock": true, 00:19:05.161 "num_base_bdevs": 4, 00:19:05.161 "num_base_bdevs_discovered": 4, 00:19:05.161 "num_base_bdevs_operational": 4, 00:19:05.161 "base_bdevs_list": [ 00:19:05.161 { 00:19:05.161 "name": "BaseBdev1", 00:19:05.161 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:19:05.161 "is_configured": true, 00:19:05.161 "data_offset": 2048, 00:19:05.161 "data_size": 63488 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "name": "BaseBdev2", 00:19:05.161 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:05.161 "is_configured": true, 00:19:05.161 "data_offset": 2048, 00:19:05.161 "data_size": 63488 00:19:05.161 }, 00:19:05.161 { 
00:19:05.161 "name": "BaseBdev3", 00:19:05.161 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:05.161 "is_configured": true, 00:19:05.161 "data_offset": 2048, 00:19:05.161 "data_size": 63488 00:19:05.161 }, 00:19:05.161 { 00:19:05.161 "name": "BaseBdev4", 00:19:05.161 "uuid": "7bc88930-c855-4efa-92d8-7dbc1a8efd93", 00:19:05.161 "is_configured": true, 00:19:05.161 "data_offset": 2048, 00:19:05.161 "data_size": 63488 00:19:05.161 } 00:19:05.161 ] 00:19:05.161 } 00:19:05.161 } 00:19:05.161 }' 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:19:05.161 BaseBdev2 00:19:05.161 BaseBdev3 00:19:05.161 BaseBdev4' 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:05.161 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:05.419 "name": "BaseBdev1", 00:19:05.419 "aliases": [ 00:19:05.419 "400f28d3-7db9-4eb6-927e-2b2bf45a3d33" 00:19:05.419 ], 00:19:05.419 "product_name": "Malloc disk", 00:19:05.419 "block_size": 512, 00:19:05.419 "num_blocks": 65536, 00:19:05.419 "uuid": "400f28d3-7db9-4eb6-927e-2b2bf45a3d33", 00:19:05.419 "assigned_rate_limits": { 00:19:05.419 "rw_ios_per_sec": 0, 00:19:05.419 "rw_mbytes_per_sec": 0, 00:19:05.419 "r_mbytes_per_sec": 0, 00:19:05.419 "w_mbytes_per_sec": 0 00:19:05.419 }, 00:19:05.419 "claimed": true, 00:19:05.419 "claim_type": "exclusive_write", 00:19:05.419 "zoned": false, 00:19:05.419 "supported_io_types": { 00:19:05.419 "read": true, 00:19:05.419 "write": true, 00:19:05.419 "unmap": true, 00:19:05.419 "write_zeroes": true, 00:19:05.419 "flush": true, 00:19:05.419 "reset": true, 00:19:05.419 "compare": false, 00:19:05.419 "compare_and_write": false, 00:19:05.419 "abort": true, 00:19:05.419 "nvme_admin": false, 00:19:05.419 "nvme_io": false 00:19:05.419 }, 00:19:05.419 "memory_domains": [ 00:19:05.419 { 00:19:05.419 "dma_device_id": "system", 00:19:05.419 "dma_device_type": 1 00:19:05.419 }, 00:19:05.419 { 00:19:05.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.419 "dma_device_type": 2 00:19:05.419 } 00:19:05.419 ], 00:19:05.419 "driver_specific": {} 00:19:05.419 }' 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:05.419 00:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:05.677 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:05.935 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:05.935 "name": "BaseBdev2", 00:19:05.935 "aliases": [ 00:19:05.935 "79490254-059e-4d9a-8ad9-078caf748b5d" 00:19:05.935 ], 00:19:05.935 "product_name": "Malloc disk", 00:19:05.935 "block_size": 512, 00:19:05.935 "num_blocks": 65536, 00:19:05.935 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:05.935 "assigned_rate_limits": { 00:19:05.935 "rw_ios_per_sec": 0, 00:19:05.935 "rw_mbytes_per_sec": 0, 00:19:05.935 "r_mbytes_per_sec": 0, 00:19:05.935 "w_mbytes_per_sec": 0 00:19:05.935 }, 00:19:05.935 "claimed": true, 00:19:05.935 "claim_type": "exclusive_write", 00:19:05.935 "zoned": false, 00:19:05.935 "supported_io_types": { 00:19:05.935 "read": true, 00:19:05.935 "write": true, 00:19:05.935 "unmap": true, 00:19:05.935 "write_zeroes": true, 00:19:05.935 "flush": true, 00:19:05.935 "reset": true, 00:19:05.935 "compare": false, 00:19:05.935 "compare_and_write": false, 00:19:05.935 "abort": true, 00:19:05.935 "nvme_admin": false, 00:19:05.935 "nvme_io": false 00:19:05.935 }, 00:19:05.935 "memory_domains": [ 00:19:05.935 { 00:19:05.935 "dma_device_id": "system", 00:19:05.935 "dma_device_type": 1 00:19:05.935 }, 00:19:05.935 { 00:19:05.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.935 "dma_device_type": 2 00:19:05.935 } 00:19:05.935 ], 00:19:05.935 "driver_specific": {} 00:19:05.935 }' 00:19:05.936 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.936 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.936 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:05.936 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == 
null ]] 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:06.195 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:06.454 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:06.454 "name": "BaseBdev3", 00:19:06.454 "aliases": [ 00:19:06.454 "c40fe600-d233-41ba-b540-a9eaf6311ab6" 00:19:06.454 ], 00:19:06.454 "product_name": "Malloc disk", 00:19:06.454 "block_size": 512, 00:19:06.454 "num_blocks": 65536, 00:19:06.454 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:06.454 "assigned_rate_limits": { 00:19:06.454 "rw_ios_per_sec": 0, 00:19:06.454 "rw_mbytes_per_sec": 0, 00:19:06.454 "r_mbytes_per_sec": 0, 00:19:06.454 "w_mbytes_per_sec": 0 00:19:06.454 }, 00:19:06.454 "claimed": true, 00:19:06.454 "claim_type": "exclusive_write", 00:19:06.454 "zoned": false, 00:19:06.454 "supported_io_types": { 00:19:06.454 "read": true, 00:19:06.454 "write": true, 00:19:06.454 "unmap": true, 00:19:06.454 "write_zeroes": true, 00:19:06.454 "flush": true, 00:19:06.454 "reset": true, 00:19:06.454 "compare": false, 00:19:06.454 "compare_and_write": false, 00:19:06.454 "abort": true, 00:19:06.454 "nvme_admin": false, 00:19:06.454 "nvme_io": false 00:19:06.454 }, 00:19:06.454 "memory_domains": [ 00:19:06.454 { 00:19:06.454 "dma_device_id": "system", 00:19:06.454 "dma_device_type": 1 00:19:06.454 }, 00:19:06.454 { 00:19:06.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.454 "dma_device_type": 2 00:19:06.454 } 00:19:06.454 ], 00:19:06.454 "driver_specific": {} 00:19:06.454 }' 00:19:06.454 00:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.454 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.713 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.972 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:06.972 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:06.972 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:06.972 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 
00:19:07.229 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:07.229 "name": "BaseBdev4", 00:19:07.229 "aliases": [ 00:19:07.229 "7bc88930-c855-4efa-92d8-7dbc1a8efd93" 00:19:07.229 ], 00:19:07.229 "product_name": "Malloc disk", 00:19:07.229 "block_size": 512, 00:19:07.229 "num_blocks": 65536, 00:19:07.229 "uuid": "7bc88930-c855-4efa-92d8-7dbc1a8efd93", 00:19:07.229 "assigned_rate_limits": { 00:19:07.229 "rw_ios_per_sec": 0, 00:19:07.229 "rw_mbytes_per_sec": 0, 00:19:07.229 "r_mbytes_per_sec": 0, 00:19:07.229 "w_mbytes_per_sec": 0 00:19:07.229 }, 00:19:07.229 "claimed": true, 00:19:07.229 "claim_type": "exclusive_write", 00:19:07.229 "zoned": false, 00:19:07.229 "supported_io_types": { 00:19:07.229 "read": true, 00:19:07.229 "write": true, 00:19:07.229 "unmap": true, 00:19:07.229 "write_zeroes": true, 00:19:07.229 "flush": true, 00:19:07.230 "reset": true, 00:19:07.230 "compare": false, 00:19:07.230 "compare_and_write": false, 00:19:07.230 "abort": true, 00:19:07.230 "nvme_admin": false, 00:19:07.230 "nvme_io": false 00:19:07.230 }, 00:19:07.230 "memory_domains": [ 00:19:07.230 { 00:19:07.230 "dma_device_id": "system", 00:19:07.230 "dma_device_type": 1 00:19:07.230 }, 00:19:07.230 { 00:19:07.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.230 "dma_device_type": 2 00:19:07.230 } 00:19:07.230 ], 00:19:07.230 "driver_specific": {} 00:19:07.230 }' 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:07.230 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:07.488 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.488 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:07.489 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:07.489 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:07.489 00:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:07.489 [2024-05-15 00:01:08.073671] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 
00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.748 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:07.748 "name": "Existed_Raid", 00:19:07.748 "uuid": "d5ad4fa7-f323-4762-ba89-3a93ff57ee39", 00:19:07.748 "strip_size_kb": 0, 00:19:07.748 "state": "online", 00:19:07.748 "raid_level": "raid1", 00:19:07.748 "superblock": true, 00:19:07.748 "num_base_bdevs": 4, 00:19:07.748 "num_base_bdevs_discovered": 3, 00:19:07.748 "num_base_bdevs_operational": 3, 00:19:07.748 "base_bdevs_list": [ 00:19:07.748 { 00:19:07.749 "name": null, 00:19:07.749 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.749 "is_configured": false, 00:19:07.749 "data_offset": 2048, 00:19:07.749 "data_size": 63488 00:19:07.749 }, 00:19:07.749 { 00:19:07.749 "name": "BaseBdev2", 00:19:07.749 "uuid": "79490254-059e-4d9a-8ad9-078caf748b5d", 00:19:07.749 "is_configured": true, 00:19:07.749 "data_offset": 2048, 00:19:07.749 "data_size": 63488 00:19:07.749 }, 00:19:07.749 { 00:19:07.749 "name": "BaseBdev3", 00:19:07.749 "uuid": "c40fe600-d233-41ba-b540-a9eaf6311ab6", 00:19:07.749 "is_configured": true, 00:19:07.749 "data_offset": 2048, 00:19:07.749 "data_size": 63488 00:19:07.749 }, 00:19:07.749 { 00:19:07.749 "name": "BaseBdev4", 00:19:07.749 "uuid": "7bc88930-c855-4efa-92d8-7dbc1a8efd93", 00:19:07.749 "is_configured": true, 00:19:07.749 "data_offset": 2048, 00:19:07.749 "data_size": 63488 00:19:07.749 } 00:19:07.749 ] 00:19:07.749 }' 00:19:07.749 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:07.749 00:01:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:08.374 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:19:08.374 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:19:08.374 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.374 00:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:19:08.632 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:19:08.632 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:08.632 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:08.890 [2024-05-15 00:01:09.330978] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:08.890 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:19:08.890 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:19:08.890 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.890 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:19:09.147 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:19:09.147 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:09.147 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:09.406 [2024-05-15 00:01:09.824680] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:09.406 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:19:09.406 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:19:09.406 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.406 00:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:19:09.664 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:19:09.664 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:09.664 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:09.921 [2024-05-15 00:01:10.285795] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:09.921 [2024-05-15 00:01:10.285866] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:09.921 [2024-05-15 00:01:10.296846] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:09.921 [2024-05-15 00:01:10.296914] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:09.921 [2024-05-15 00:01:10.296927] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bfa470 name Existed_Raid, state offline 00:19:09.921 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 
00:19:09.921 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:19:09.921 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.921 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:19:10.185 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:10.442 BaseBdev2 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:10.442 00:01:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:10.701 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:10.959 [ 00:19:10.959 { 00:19:10.959 "name": "BaseBdev2", 00:19:10.959 "aliases": [ 00:19:10.959 "e7b2f9d8-2946-4ebd-9824-5a2731fb3917" 00:19:10.959 ], 00:19:10.959 "product_name": "Malloc disk", 00:19:10.959 "block_size": 512, 00:19:10.959 "num_blocks": 65536, 00:19:10.959 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:10.959 "assigned_rate_limits": { 00:19:10.959 "rw_ios_per_sec": 0, 00:19:10.959 "rw_mbytes_per_sec": 0, 00:19:10.959 "r_mbytes_per_sec": 0, 00:19:10.959 "w_mbytes_per_sec": 0 00:19:10.959 }, 00:19:10.959 "claimed": false, 00:19:10.959 "zoned": false, 00:19:10.959 "supported_io_types": { 00:19:10.959 "read": true, 00:19:10.959 "write": true, 00:19:10.959 "unmap": true, 00:19:10.959 "write_zeroes": true, 00:19:10.959 "flush": true, 00:19:10.959 "reset": true, 00:19:10.959 "compare": false, 00:19:10.959 "compare_and_write": false, 00:19:10.959 "abort": true, 00:19:10.959 "nvme_admin": false, 00:19:10.959 "nvme_io": false 00:19:10.959 }, 00:19:10.959 "memory_domains": [ 00:19:10.959 { 00:19:10.959 "dma_device_id": "system", 00:19:10.959 "dma_device_type": 1 00:19:10.959 }, 00:19:10.959 { 00:19:10.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.959 "dma_device_type": 2 
00:19:10.959 } 00:19:10.959 ], 00:19:10.959 "driver_specific": {} 00:19:10.959 } 00:19:10.959 ] 00:19:10.959 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:10.959 00:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:19:10.959 00:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:19:10.959 00:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:10.959 BaseBdev3 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:11.219 00:01:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:11.477 [ 00:19:11.477 { 00:19:11.477 "name": "BaseBdev3", 00:19:11.477 "aliases": [ 00:19:11.477 "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4" 00:19:11.477 ], 00:19:11.477 "product_name": "Malloc disk", 00:19:11.477 "block_size": 512, 00:19:11.477 "num_blocks": 65536, 00:19:11.477 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:11.477 "assigned_rate_limits": { 00:19:11.477 "rw_ios_per_sec": 0, 00:19:11.477 "rw_mbytes_per_sec": 0, 00:19:11.477 "r_mbytes_per_sec": 0, 00:19:11.477 "w_mbytes_per_sec": 0 00:19:11.477 }, 00:19:11.477 "claimed": false, 00:19:11.477 "zoned": false, 00:19:11.477 "supported_io_types": { 00:19:11.477 "read": true, 00:19:11.477 "write": true, 00:19:11.477 "unmap": true, 00:19:11.477 "write_zeroes": true, 00:19:11.477 "flush": true, 00:19:11.477 "reset": true, 00:19:11.477 "compare": false, 00:19:11.477 "compare_and_write": false, 00:19:11.477 "abort": true, 00:19:11.477 "nvme_admin": false, 00:19:11.477 "nvme_io": false 00:19:11.477 }, 00:19:11.477 "memory_domains": [ 00:19:11.477 { 00:19:11.477 "dma_device_id": "system", 00:19:11.477 "dma_device_type": 1 00:19:11.477 }, 00:19:11.477 { 00:19:11.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.477 "dma_device_type": 2 00:19:11.477 } 00:19:11.477 ], 00:19:11.477 "driver_specific": {} 00:19:11.477 } 00:19:11.477 ] 00:19:11.477 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:11.477 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:19:11.477 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:19:11.477 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:11.737 BaseBdev4 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:11.737 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:11.994 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:12.253 [ 00:19:12.253 { 00:19:12.253 "name": "BaseBdev4", 00:19:12.253 "aliases": [ 00:19:12.253 "c163209e-529c-4ba3-b133-54c794ba5c98" 00:19:12.253 ], 00:19:12.253 "product_name": "Malloc disk", 00:19:12.253 "block_size": 512, 00:19:12.253 "num_blocks": 65536, 00:19:12.253 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:12.253 "assigned_rate_limits": { 00:19:12.253 "rw_ios_per_sec": 0, 00:19:12.253 "rw_mbytes_per_sec": 0, 00:19:12.253 "r_mbytes_per_sec": 0, 00:19:12.253 "w_mbytes_per_sec": 0 00:19:12.253 }, 00:19:12.253 "claimed": false, 00:19:12.253 "zoned": false, 00:19:12.253 "supported_io_types": { 00:19:12.253 "read": true, 00:19:12.253 "write": true, 00:19:12.253 "unmap": true, 00:19:12.253 "write_zeroes": true, 00:19:12.253 "flush": true, 00:19:12.253 "reset": true, 00:19:12.253 "compare": false, 00:19:12.253 "compare_and_write": false, 00:19:12.253 "abort": true, 00:19:12.253 "nvme_admin": false, 00:19:12.253 "nvme_io": false 00:19:12.253 }, 00:19:12.253 "memory_domains": [ 00:19:12.253 { 00:19:12.253 "dma_device_id": "system", 00:19:12.253 "dma_device_type": 1 00:19:12.253 }, 00:19:12.253 { 00:19:12.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.253 "dma_device_type": 2 00:19:12.253 } 00:19:12.253 ], 00:19:12.253 "driver_specific": {} 00:19:12.253 } 00:19:12.253 ] 00:19:12.253 00:01:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:12.253 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:19:12.253 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:19:12.253 00:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:12.511 [2024-05-15 00:01:12.994471] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:12.511 [2024-05-15 00:01:12.994522] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:12.511 [2024-05-15 00:01:12.994545] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 
is claimed 00:19:12.511 [2024-05-15 00:01:12.995955] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:12.511 [2024-05-15 00:01:12.996000] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.511 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.769 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:12.769 "name": "Existed_Raid", 00:19:12.769 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:12.769 "strip_size_kb": 0, 00:19:12.769 "state": "configuring", 00:19:12.769 "raid_level": "raid1", 00:19:12.769 "superblock": true, 00:19:12.769 "num_base_bdevs": 4, 00:19:12.769 "num_base_bdevs_discovered": 3, 00:19:12.769 "num_base_bdevs_operational": 4, 00:19:12.769 "base_bdevs_list": [ 00:19:12.769 { 00:19:12.769 "name": "BaseBdev1", 00:19:12.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.769 "is_configured": false, 00:19:12.769 "data_offset": 0, 00:19:12.769 "data_size": 0 00:19:12.769 }, 00:19:12.769 { 00:19:12.769 "name": "BaseBdev2", 00:19:12.769 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:12.769 "is_configured": true, 00:19:12.769 "data_offset": 2048, 00:19:12.769 "data_size": 63488 00:19:12.769 }, 00:19:12.769 { 00:19:12.769 "name": "BaseBdev3", 00:19:12.769 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:12.769 "is_configured": true, 00:19:12.769 "data_offset": 2048, 00:19:12.769 "data_size": 63488 00:19:12.769 }, 00:19:12.769 { 00:19:12.769 "name": "BaseBdev4", 00:19:12.769 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:12.769 "is_configured": true, 00:19:12.769 "data_offset": 2048, 00:19:12.769 "data_size": 63488 00:19:12.769 } 00:19:12.769 ] 00:19:12.769 }' 00:19:12.769 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:12.769 00:01:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:13.336 00:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:13.595 [2024-05-15 00:01:14.061267] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.595 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:13.854 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:13.854 "name": "Existed_Raid", 00:19:13.854 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:13.854 "strip_size_kb": 0, 00:19:13.854 "state": "configuring", 00:19:13.854 "raid_level": "raid1", 00:19:13.854 "superblock": true, 00:19:13.854 "num_base_bdevs": 4, 00:19:13.854 "num_base_bdevs_discovered": 2, 00:19:13.854 "num_base_bdevs_operational": 4, 00:19:13.854 "base_bdevs_list": [ 00:19:13.854 { 00:19:13.854 "name": "BaseBdev1", 00:19:13.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.854 "is_configured": false, 00:19:13.854 "data_offset": 0, 00:19:13.854 "data_size": 0 00:19:13.854 }, 00:19:13.854 { 00:19:13.854 "name": null, 00:19:13.854 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:13.854 "is_configured": false, 00:19:13.854 "data_offset": 2048, 00:19:13.854 "data_size": 63488 00:19:13.854 }, 00:19:13.854 { 00:19:13.854 "name": "BaseBdev3", 00:19:13.854 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:13.854 "is_configured": true, 00:19:13.854 "data_offset": 2048, 00:19:13.854 "data_size": 63488 00:19:13.854 }, 00:19:13.854 { 00:19:13.854 "name": "BaseBdev4", 00:19:13.854 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:13.854 "is_configured": true, 00:19:13.854 "data_offset": 2048, 00:19:13.854 "data_size": 63488 00:19:13.855 } 00:19:13.855 ] 00:19:13.855 }' 00:19:13.855 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:13.855 00:01:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:14.421 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.421 00:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:14.679 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:19:14.679 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:14.936 [2024-05-15 00:01:15.409451] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:14.936 BaseBdev1 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:14.936 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.194 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:15.455 [ 00:19:15.455 { 00:19:15.455 "name": "BaseBdev1", 00:19:15.455 "aliases": [ 00:19:15.455 "3f58b040-8386-45a9-8680-2f40206adb95" 00:19:15.455 ], 00:19:15.455 "product_name": "Malloc disk", 00:19:15.455 "block_size": 512, 00:19:15.455 "num_blocks": 65536, 00:19:15.455 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:15.455 "assigned_rate_limits": { 00:19:15.455 "rw_ios_per_sec": 0, 00:19:15.455 "rw_mbytes_per_sec": 0, 00:19:15.455 "r_mbytes_per_sec": 0, 00:19:15.455 "w_mbytes_per_sec": 0 00:19:15.455 }, 00:19:15.455 "claimed": true, 00:19:15.455 "claim_type": "exclusive_write", 00:19:15.455 "zoned": false, 00:19:15.455 "supported_io_types": { 00:19:15.455 "read": true, 00:19:15.455 "write": true, 00:19:15.455 "unmap": true, 00:19:15.455 "write_zeroes": true, 00:19:15.455 "flush": true, 00:19:15.455 "reset": true, 00:19:15.455 "compare": false, 00:19:15.455 "compare_and_write": false, 00:19:15.455 "abort": true, 00:19:15.455 "nvme_admin": false, 00:19:15.455 "nvme_io": false 00:19:15.455 }, 00:19:15.455 "memory_domains": [ 00:19:15.455 { 00:19:15.455 "dma_device_id": "system", 00:19:15.455 "dma_device_type": 1 00:19:15.455 }, 00:19:15.455 { 00:19:15.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.455 "dma_device_type": 2 00:19:15.455 } 00:19:15.455 ], 00:19:15.455 "driver_specific": {} 00:19:15.455 } 00:19:15.455 ] 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local raid_bdev_name=Existed_Raid 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.455 00:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.715 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:15.715 "name": "Existed_Raid", 00:19:15.715 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:15.715 "strip_size_kb": 0, 00:19:15.715 "state": "configuring", 00:19:15.715 "raid_level": "raid1", 00:19:15.715 "superblock": true, 00:19:15.715 "num_base_bdevs": 4, 00:19:15.715 "num_base_bdevs_discovered": 3, 00:19:15.715 "num_base_bdevs_operational": 4, 00:19:15.715 "base_bdevs_list": [ 00:19:15.715 { 00:19:15.715 "name": "BaseBdev1", 00:19:15.715 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:15.715 "is_configured": true, 00:19:15.715 "data_offset": 2048, 00:19:15.715 "data_size": 63488 00:19:15.715 }, 00:19:15.715 { 00:19:15.715 "name": null, 00:19:15.715 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:15.715 "is_configured": false, 00:19:15.715 "data_offset": 2048, 00:19:15.715 "data_size": 63488 00:19:15.715 }, 00:19:15.715 { 00:19:15.715 "name": "BaseBdev3", 00:19:15.715 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:15.715 "is_configured": true, 00:19:15.715 "data_offset": 2048, 00:19:15.715 "data_size": 63488 00:19:15.715 }, 00:19:15.715 { 00:19:15.715 "name": "BaseBdev4", 00:19:15.715 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:15.715 "is_configured": true, 00:19:15.715 "data_offset": 2048, 00:19:15.715 "data_size": 63488 00:19:15.715 } 00:19:15.715 ] 00:19:15.715 }' 00:19:15.715 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:15.715 00:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.281 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.281 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:16.540 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:19:16.540 00:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:16.796 [2024-05-15 00:01:17.222278] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.796 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.054 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:17.054 "name": "Existed_Raid", 00:19:17.054 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:17.054 "strip_size_kb": 0, 00:19:17.054 "state": "configuring", 00:19:17.054 "raid_level": "raid1", 00:19:17.054 "superblock": true, 00:19:17.054 "num_base_bdevs": 4, 00:19:17.054 "num_base_bdevs_discovered": 2, 00:19:17.054 "num_base_bdevs_operational": 4, 00:19:17.054 "base_bdevs_list": [ 00:19:17.054 { 00:19:17.054 "name": "BaseBdev1", 00:19:17.054 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:17.054 "is_configured": true, 00:19:17.054 "data_offset": 2048, 00:19:17.054 "data_size": 63488 00:19:17.054 }, 00:19:17.054 { 00:19:17.054 "name": null, 00:19:17.054 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:17.054 "is_configured": false, 00:19:17.054 "data_offset": 2048, 00:19:17.054 "data_size": 63488 00:19:17.054 }, 00:19:17.054 { 00:19:17.054 "name": null, 00:19:17.054 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:17.054 "is_configured": false, 00:19:17.054 "data_offset": 2048, 00:19:17.054 "data_size": 63488 00:19:17.054 }, 00:19:17.054 { 00:19:17.054 "name": "BaseBdev4", 00:19:17.054 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:17.054 "is_configured": true, 00:19:17.054 "data_offset": 2048, 00:19:17.054 "data_size": 63488 00:19:17.054 } 00:19:17.054 ] 00:19:17.054 }' 00:19:17.054 00:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:17.054 00:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.617 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.617 
00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:17.875 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:19:17.875 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:18.132 [2024-05-15 00:01:18.509722] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.132 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.389 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:18.389 "name": "Existed_Raid", 00:19:18.389 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:18.389 "strip_size_kb": 0, 00:19:18.389 "state": "configuring", 00:19:18.389 "raid_level": "raid1", 00:19:18.389 "superblock": true, 00:19:18.389 "num_base_bdevs": 4, 00:19:18.389 "num_base_bdevs_discovered": 3, 00:19:18.389 "num_base_bdevs_operational": 4, 00:19:18.389 "base_bdevs_list": [ 00:19:18.389 { 00:19:18.389 "name": "BaseBdev1", 00:19:18.389 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:18.389 "is_configured": true, 00:19:18.389 "data_offset": 2048, 00:19:18.389 "data_size": 63488 00:19:18.389 }, 00:19:18.389 { 00:19:18.389 "name": null, 00:19:18.389 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:18.389 "is_configured": false, 00:19:18.389 "data_offset": 2048, 00:19:18.389 "data_size": 63488 00:19:18.389 }, 00:19:18.389 { 00:19:18.389 "name": "BaseBdev3", 00:19:18.389 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:18.389 "is_configured": true, 00:19:18.389 "data_offset": 2048, 00:19:18.389 "data_size": 63488 00:19:18.389 }, 00:19:18.389 { 00:19:18.389 "name": "BaseBdev4", 00:19:18.389 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:18.389 "is_configured": true, 00:19:18.389 "data_offset": 2048, 00:19:18.389 "data_size": 63488 00:19:18.389 } 00:19:18.389 ] 00:19:18.389 }' 
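The jq assertions above (and the ones that follow) all exercise the same pattern: after every bdev_raid_remove_base_bdev / bdev_raid_add_base_bdev call, the test dumps the array with bdev_raid_get_bdevs and checks the state, the discovered/operational counters and the per-slot is_configured flags. Below is a condensed, standalone sketch of that pattern; it reuses the rpc.py path, the /var/tmp/spdk-raid.sock socket and the Existed_Raid name that appear in this log, but the particular fields printed are illustrative rather than a verbatim extract of verify_raid_bdev_state.

#!/usr/bin/env bash
# Sketch: dump a RAID bdev over the SPDK RPC socket and inspect its state with jq.
# Assumes an SPDK app is already listening on the socket and that a RAID bdev
# named Existed_Raid exists; both names are taken from the log above.
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Keep only the RAID bdev under test from the full dump.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')

state=$(jq -r '.state' <<< "$info")                          # "configuring" or "online"
discovered=$(jq -r '.num_base_bdevs_discovered' <<< "$info")
operational=$(jq -r '.num_base_bdevs_operational' <<< "$info")
echo "state=$state base_bdevs=$discovered/$operational"

# Individual slots are checked the same way, e.g. whether slot 2 is configured:
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'

Removing a base bdev from a superblock-enabled raid1 leaves the array in the "configuring" state with a null slot, and re-adding a bdev flips the corresponding is_configured flag back to true, which is exactly what the dumps above show before and after each RPC.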
00:19:18.389 00:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:18.389 00:01:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.952 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.952 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:19.210 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:19:19.210 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:19.468 [2024-05-15 00:01:19.801173] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.468 00:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.725 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:19.725 "name": "Existed_Raid", 00:19:19.725 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:19.725 "strip_size_kb": 0, 00:19:19.725 "state": "configuring", 00:19:19.725 "raid_level": "raid1", 00:19:19.726 "superblock": true, 00:19:19.726 "num_base_bdevs": 4, 00:19:19.726 "num_base_bdevs_discovered": 2, 00:19:19.726 "num_base_bdevs_operational": 4, 00:19:19.726 "base_bdevs_list": [ 00:19:19.726 { 00:19:19.726 "name": null, 00:19:19.726 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:19.726 "is_configured": false, 00:19:19.726 "data_offset": 2048, 00:19:19.726 "data_size": 63488 00:19:19.726 }, 00:19:19.726 { 00:19:19.726 "name": null, 00:19:19.726 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:19.726 "is_configured": false, 00:19:19.726 "data_offset": 2048, 00:19:19.726 "data_size": 63488 00:19:19.726 }, 00:19:19.726 { 00:19:19.726 "name": "BaseBdev3", 00:19:19.726 "uuid": 
"97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:19.726 "is_configured": true, 00:19:19.726 "data_offset": 2048, 00:19:19.726 "data_size": 63488 00:19:19.726 }, 00:19:19.726 { 00:19:19.726 "name": "BaseBdev4", 00:19:19.726 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:19.726 "is_configured": true, 00:19:19.726 "data_offset": 2048, 00:19:19.726 "data_size": 63488 00:19:19.726 } 00:19:19.726 ] 00:19:19.726 }' 00:19:19.726 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:19.726 00:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.290 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.290 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:20.548 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:19:20.548 00:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:20.548 [2024-05-15 00:01:21.121007] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:20.805 "name": "Existed_Raid", 00:19:20.805 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:20.805 "strip_size_kb": 0, 00:19:20.805 "state": "configuring", 00:19:20.805 "raid_level": "raid1", 00:19:20.805 "superblock": true, 00:19:20.805 "num_base_bdevs": 4, 00:19:20.805 "num_base_bdevs_discovered": 3, 00:19:20.805 "num_base_bdevs_operational": 4, 00:19:20.805 "base_bdevs_list": [ 00:19:20.805 { 00:19:20.805 "name": null, 00:19:20.805 "uuid": 
"3f58b040-8386-45a9-8680-2f40206adb95", 00:19:20.805 "is_configured": false, 00:19:20.805 "data_offset": 2048, 00:19:20.805 "data_size": 63488 00:19:20.805 }, 00:19:20.805 { 00:19:20.805 "name": "BaseBdev2", 00:19:20.805 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:20.805 "is_configured": true, 00:19:20.805 "data_offset": 2048, 00:19:20.805 "data_size": 63488 00:19:20.805 }, 00:19:20.805 { 00:19:20.805 "name": "BaseBdev3", 00:19:20.805 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:20.805 "is_configured": true, 00:19:20.805 "data_offset": 2048, 00:19:20.805 "data_size": 63488 00:19:20.805 }, 00:19:20.805 { 00:19:20.805 "name": "BaseBdev4", 00:19:20.805 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:20.805 "is_configured": true, 00:19:20.805 "data_offset": 2048, 00:19:20.805 "data_size": 63488 00:19:20.805 } 00:19:20.805 ] 00:19:20.805 }' 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:20.805 00:01:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.372 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.372 00:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:21.646 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:19:21.646 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.646 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:21.914 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3f58b040-8386-45a9-8680-2f40206adb95 00:19:22.171 [2024-05-15 00:01:22.548164] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:22.171 [2024-05-15 00:01:22.548329] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bf96d0 00:19:22.171 [2024-05-15 00:01:22.548343] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:22.171 [2024-05-15 00:01:22.548534] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b15340 00:19:22.171 [2024-05-15 00:01:22.548665] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bf96d0 00:19:22.171 [2024-05-15 00:01:22.548675] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bf96d0 00:19:22.171 [2024-05-15 00:01:22.548771] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:22.171 NewBaseBdev 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:19:22.171 00:01:22 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:22.171 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:22.429 [ 00:19:22.430 { 00:19:22.430 "name": "NewBaseBdev", 00:19:22.430 "aliases": [ 00:19:22.430 "3f58b040-8386-45a9-8680-2f40206adb95" 00:19:22.430 ], 00:19:22.430 "product_name": "Malloc disk", 00:19:22.430 "block_size": 512, 00:19:22.430 "num_blocks": 65536, 00:19:22.430 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:22.430 "assigned_rate_limits": { 00:19:22.430 "rw_ios_per_sec": 0, 00:19:22.430 "rw_mbytes_per_sec": 0, 00:19:22.430 "r_mbytes_per_sec": 0, 00:19:22.430 "w_mbytes_per_sec": 0 00:19:22.430 }, 00:19:22.430 "claimed": true, 00:19:22.430 "claim_type": "exclusive_write", 00:19:22.430 "zoned": false, 00:19:22.430 "supported_io_types": { 00:19:22.430 "read": true, 00:19:22.430 "write": true, 00:19:22.430 "unmap": true, 00:19:22.430 "write_zeroes": true, 00:19:22.430 "flush": true, 00:19:22.430 "reset": true, 00:19:22.430 "compare": false, 00:19:22.430 "compare_and_write": false, 00:19:22.430 "abort": true, 00:19:22.430 "nvme_admin": false, 00:19:22.430 "nvme_io": false 00:19:22.430 }, 00:19:22.430 "memory_domains": [ 00:19:22.430 { 00:19:22.430 "dma_device_id": "system", 00:19:22.430 "dma_device_type": 1 00:19:22.430 }, 00:19:22.430 { 00:19:22.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.430 "dma_device_type": 2 00:19:22.430 } 00:19:22.430 ], 00:19:22.430 "driver_specific": {} 00:19:22.430 } 00:19:22.430 ] 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.430 00:01:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.688 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:22.688 "name": "Existed_Raid", 00:19:22.688 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:22.688 "strip_size_kb": 0, 00:19:22.688 "state": "online", 00:19:22.688 "raid_level": "raid1", 00:19:22.688 "superblock": true, 00:19:22.688 "num_base_bdevs": 4, 00:19:22.688 "num_base_bdevs_discovered": 4, 00:19:22.688 "num_base_bdevs_operational": 4, 00:19:22.688 "base_bdevs_list": [ 00:19:22.688 { 00:19:22.688 "name": "NewBaseBdev", 00:19:22.688 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:22.688 "is_configured": true, 00:19:22.688 "data_offset": 2048, 00:19:22.688 "data_size": 63488 00:19:22.688 }, 00:19:22.688 { 00:19:22.688 "name": "BaseBdev2", 00:19:22.688 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:22.688 "is_configured": true, 00:19:22.688 "data_offset": 2048, 00:19:22.688 "data_size": 63488 00:19:22.688 }, 00:19:22.688 { 00:19:22.688 "name": "BaseBdev3", 00:19:22.688 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:22.688 "is_configured": true, 00:19:22.688 "data_offset": 2048, 00:19:22.688 "data_size": 63488 00:19:22.688 }, 00:19:22.688 { 00:19:22.688 "name": "BaseBdev4", 00:19:22.688 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:22.688 "is_configured": true, 00:19:22.688 "data_offset": 2048, 00:19:22.688 "data_size": 63488 00:19:22.688 } 00:19:22.688 ] 00:19:22.688 }' 00:19:22.688 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:22.688 00:01:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:23.253 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:19:23.512 [2024-05-15 00:01:23.960217] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:23.512 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:19:23.512 "name": "Existed_Raid", 00:19:23.512 "aliases": [ 00:19:23.512 "ddae3ed2-17a1-480f-87a7-3194a1d55c96" 00:19:23.512 ], 00:19:23.512 "product_name": "Raid Volume", 00:19:23.512 "block_size": 512, 00:19:23.512 "num_blocks": 63488, 00:19:23.512 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:23.512 "assigned_rate_limits": { 00:19:23.512 "rw_ios_per_sec": 0, 00:19:23.512 "rw_mbytes_per_sec": 0, 00:19:23.512 "r_mbytes_per_sec": 0, 00:19:23.512 "w_mbytes_per_sec": 0 00:19:23.512 }, 00:19:23.512 "claimed": false, 00:19:23.512 "zoned": false, 00:19:23.512 "supported_io_types": { 
00:19:23.512 "read": true, 00:19:23.512 "write": true, 00:19:23.512 "unmap": false, 00:19:23.512 "write_zeroes": true, 00:19:23.512 "flush": false, 00:19:23.512 "reset": true, 00:19:23.512 "compare": false, 00:19:23.512 "compare_and_write": false, 00:19:23.512 "abort": false, 00:19:23.512 "nvme_admin": false, 00:19:23.512 "nvme_io": false 00:19:23.512 }, 00:19:23.512 "memory_domains": [ 00:19:23.512 { 00:19:23.512 "dma_device_id": "system", 00:19:23.512 "dma_device_type": 1 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.512 "dma_device_type": 2 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "system", 00:19:23.512 "dma_device_type": 1 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.512 "dma_device_type": 2 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "system", 00:19:23.512 "dma_device_type": 1 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.512 "dma_device_type": 2 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "system", 00:19:23.512 "dma_device_type": 1 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.512 "dma_device_type": 2 00:19:23.512 } 00:19:23.512 ], 00:19:23.512 "driver_specific": { 00:19:23.512 "raid": { 00:19:23.512 "uuid": "ddae3ed2-17a1-480f-87a7-3194a1d55c96", 00:19:23.512 "strip_size_kb": 0, 00:19:23.512 "state": "online", 00:19:23.512 "raid_level": "raid1", 00:19:23.512 "superblock": true, 00:19:23.512 "num_base_bdevs": 4, 00:19:23.512 "num_base_bdevs_discovered": 4, 00:19:23.512 "num_base_bdevs_operational": 4, 00:19:23.512 "base_bdevs_list": [ 00:19:23.512 { 00:19:23.512 "name": "NewBaseBdev", 00:19:23.512 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:23.512 "is_configured": true, 00:19:23.512 "data_offset": 2048, 00:19:23.512 "data_size": 63488 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "name": "BaseBdev2", 00:19:23.512 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:23.512 "is_configured": true, 00:19:23.512 "data_offset": 2048, 00:19:23.512 "data_size": 63488 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "name": "BaseBdev3", 00:19:23.512 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:23.512 "is_configured": true, 00:19:23.512 "data_offset": 2048, 00:19:23.512 "data_size": 63488 00:19:23.512 }, 00:19:23.512 { 00:19:23.512 "name": "BaseBdev4", 00:19:23.512 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:23.512 "is_configured": true, 00:19:23.512 "data_offset": 2048, 00:19:23.512 "data_size": 63488 00:19:23.512 } 00:19:23.512 ] 00:19:23.512 } 00:19:23.512 } 00:19:23.512 }' 00:19:23.512 00:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:23.513 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:19:23.513 BaseBdev2 00:19:23.513 BaseBdev3 00:19:23.513 BaseBdev4' 00:19:23.513 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:23.513 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:23.513 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:23.770 00:01:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:23.770 "name": "NewBaseBdev", 00:19:23.770 "aliases": [ 00:19:23.770 "3f58b040-8386-45a9-8680-2f40206adb95" 00:19:23.770 ], 00:19:23.770 "product_name": "Malloc disk", 00:19:23.770 "block_size": 512, 00:19:23.770 "num_blocks": 65536, 00:19:23.770 "uuid": "3f58b040-8386-45a9-8680-2f40206adb95", 00:19:23.770 "assigned_rate_limits": { 00:19:23.770 "rw_ios_per_sec": 0, 00:19:23.770 "rw_mbytes_per_sec": 0, 00:19:23.770 "r_mbytes_per_sec": 0, 00:19:23.770 "w_mbytes_per_sec": 0 00:19:23.770 }, 00:19:23.770 "claimed": true, 00:19:23.770 "claim_type": "exclusive_write", 00:19:23.770 "zoned": false, 00:19:23.770 "supported_io_types": { 00:19:23.770 "read": true, 00:19:23.770 "write": true, 00:19:23.770 "unmap": true, 00:19:23.770 "write_zeroes": true, 00:19:23.770 "flush": true, 00:19:23.770 "reset": true, 00:19:23.770 "compare": false, 00:19:23.770 "compare_and_write": false, 00:19:23.770 "abort": true, 00:19:23.770 "nvme_admin": false, 00:19:23.770 "nvme_io": false 00:19:23.770 }, 00:19:23.770 "memory_domains": [ 00:19:23.770 { 00:19:23.770 "dma_device_id": "system", 00:19:23.770 "dma_device_type": 1 00:19:23.770 }, 00:19:23.770 { 00:19:23.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.770 "dma_device_type": 2 00:19:23.770 } 00:19:23.770 ], 00:19:23.770 "driver_specific": {} 00:19:23.770 }' 00:19:23.770 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:23.770 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:24.027 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:24.285 "name": "BaseBdev2", 00:19:24.285 "aliases": [ 00:19:24.285 "e7b2f9d8-2946-4ebd-9824-5a2731fb3917" 00:19:24.285 ], 00:19:24.285 "product_name": "Malloc disk", 00:19:24.285 "block_size": 512, 00:19:24.285 "num_blocks": 65536, 00:19:24.285 "uuid": "e7b2f9d8-2946-4ebd-9824-5a2731fb3917", 00:19:24.285 "assigned_rate_limits": { 00:19:24.285 "rw_ios_per_sec": 0, 00:19:24.285 
"rw_mbytes_per_sec": 0, 00:19:24.285 "r_mbytes_per_sec": 0, 00:19:24.285 "w_mbytes_per_sec": 0 00:19:24.285 }, 00:19:24.285 "claimed": true, 00:19:24.285 "claim_type": "exclusive_write", 00:19:24.285 "zoned": false, 00:19:24.285 "supported_io_types": { 00:19:24.285 "read": true, 00:19:24.285 "write": true, 00:19:24.285 "unmap": true, 00:19:24.285 "write_zeroes": true, 00:19:24.285 "flush": true, 00:19:24.285 "reset": true, 00:19:24.285 "compare": false, 00:19:24.285 "compare_and_write": false, 00:19:24.285 "abort": true, 00:19:24.285 "nvme_admin": false, 00:19:24.285 "nvme_io": false 00:19:24.285 }, 00:19:24.285 "memory_domains": [ 00:19:24.285 { 00:19:24.285 "dma_device_id": "system", 00:19:24.285 "dma_device_type": 1 00:19:24.285 }, 00:19:24.285 { 00:19:24.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.285 "dma_device_type": 2 00:19:24.285 } 00:19:24.285 ], 00:19:24.285 "driver_specific": {} 00:19:24.285 }' 00:19:24.285 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:24.542 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:24.542 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:24.542 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:24.542 00:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:24.542 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.542 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:24.542 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:24.542 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.542 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:24.800 "name": "BaseBdev3", 00:19:24.800 "aliases": [ 00:19:24.800 "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4" 00:19:24.800 ], 00:19:24.800 "product_name": "Malloc disk", 00:19:24.800 "block_size": 512, 00:19:24.800 "num_blocks": 65536, 00:19:24.800 "uuid": "97f221e1-3ab2-48a6-aa3d-e49017e3d8a4", 00:19:24.800 "assigned_rate_limits": { 00:19:24.800 "rw_ios_per_sec": 0, 00:19:24.800 "rw_mbytes_per_sec": 0, 00:19:24.800 "r_mbytes_per_sec": 0, 00:19:24.800 "w_mbytes_per_sec": 0 00:19:24.800 }, 00:19:24.800 "claimed": true, 00:19:24.800 "claim_type": "exclusive_write", 00:19:24.800 "zoned": false, 00:19:24.800 "supported_io_types": { 00:19:24.800 "read": true, 00:19:24.800 "write": true, 00:19:24.800 "unmap": true, 00:19:24.800 "write_zeroes": true, 00:19:24.800 "flush": true, 00:19:24.800 "reset": true, 
00:19:24.800 "compare": false, 00:19:24.800 "compare_and_write": false, 00:19:24.800 "abort": true, 00:19:24.800 "nvme_admin": false, 00:19:24.800 "nvme_io": false 00:19:24.800 }, 00:19:24.800 "memory_domains": [ 00:19:24.800 { 00:19:24.800 "dma_device_id": "system", 00:19:24.800 "dma_device_type": 1 00:19:24.800 }, 00:19:24.800 { 00:19:24.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.800 "dma_device_type": 2 00:19:24.800 } 00:19:24.800 ], 00:19:24.800 "driver_specific": {} 00:19:24.800 }' 00:19:24.800 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.058 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:25.316 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:25.316 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:25.316 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:25.316 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:25.316 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:25.574 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:25.574 "name": "BaseBdev4", 00:19:25.574 "aliases": [ 00:19:25.574 "c163209e-529c-4ba3-b133-54c794ba5c98" 00:19:25.574 ], 00:19:25.574 "product_name": "Malloc disk", 00:19:25.574 "block_size": 512, 00:19:25.574 "num_blocks": 65536, 00:19:25.574 "uuid": "c163209e-529c-4ba3-b133-54c794ba5c98", 00:19:25.574 "assigned_rate_limits": { 00:19:25.574 "rw_ios_per_sec": 0, 00:19:25.574 "rw_mbytes_per_sec": 0, 00:19:25.574 "r_mbytes_per_sec": 0, 00:19:25.574 "w_mbytes_per_sec": 0 00:19:25.574 }, 00:19:25.574 "claimed": true, 00:19:25.574 "claim_type": "exclusive_write", 00:19:25.574 "zoned": false, 00:19:25.574 "supported_io_types": { 00:19:25.574 "read": true, 00:19:25.574 "write": true, 00:19:25.574 "unmap": true, 00:19:25.574 "write_zeroes": true, 00:19:25.574 "flush": true, 00:19:25.574 "reset": true, 00:19:25.574 "compare": false, 00:19:25.574 "compare_and_write": false, 00:19:25.574 "abort": true, 00:19:25.574 "nvme_admin": false, 00:19:25.574 "nvme_io": false 00:19:25.574 }, 00:19:25.574 "memory_domains": [ 00:19:25.574 { 00:19:25.574 "dma_device_id": "system", 00:19:25.574 "dma_device_type": 1 00:19:25.574 }, 00:19:25.574 { 00:19:25.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.574 "dma_device_type": 2 00:19:25.574 
} 00:19:25.574 ], 00:19:25.574 "driver_specific": {} 00:19:25.574 }' 00:19:25.574 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:25.574 00:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:25.574 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:25.832 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.832 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:25.832 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:25.832 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:25.832 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:26.090 [2024-05-15 00:01:26.454570] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:26.090 [2024-05-15 00:01:26.454601] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:26.090 [2024-05-15 00:01:26.454663] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:26.090 [2024-05-15 00:01:26.454943] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:26.090 [2024-05-15 00:01:26.454957] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bf96d0 name Existed_Raid, state offline 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 463772 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 463772 ']' 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 463772 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 463772 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 463772' 00:19:26.090 killing process with pid 463772 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 463772 00:19:26.090 [2024-05-15 00:01:26.502750] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:19:26.090 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 463772 00:19:26.090 [2024-05-15 00:01:26.538633] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:26.348 00:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:19:26.348 00:19:26.348 real 0m31.194s 00:19:26.348 user 0m57.368s 00:19:26.348 sys 0m5.512s 00:19:26.348 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:26.348 00:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:26.348 ************************************ 00:19:26.348 END TEST raid_state_function_test_sb 00:19:26.348 ************************************ 00:19:26.348 00:01:26 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:19:26.348 00:01:26 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:19:26.348 00:01:26 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:26.348 00:01:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:26.348 ************************************ 00:19:26.348 START TEST raid_superblock_test 00:19:26.348 ************************************ 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 4 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:19:26.348 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=468447 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 468447 /var/tmp/spdk-raid.sock 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 468447 ']' 00:19:26.349 
00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:26.349 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:26.349 00:01:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.349 [2024-05-15 00:01:26.932824] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:19:26.349 [2024-05-15 00:01:26.932892] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468447 ] 00:19:26.664 [2024-05-15 00:01:27.063097] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:26.664 [2024-05-15 00:01:27.164843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:26.664 [2024-05-15 00:01:27.229577] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:26.664 [2024-05-15 00:01:27.229618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:27.595 00:01:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:27.595 malloc1 00:19:27.595 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:27.851 [2024-05-15 00:01:28.331517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:27.851 [2024-05-15 00:01:28.331566] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:27.851 [2024-05-15 00:01:28.331589] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b69780 00:19:27.851 [2024-05-15 00:01:28.331602] 
vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.851 [2024-05-15 00:01:28.333301] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.851 [2024-05-15 00:01:28.333332] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:27.851 pt1 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:27.851 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:28.108 malloc2 00:19:28.108 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:28.365 [2024-05-15 00:01:28.833717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:28.366 [2024-05-15 00:01:28.833764] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:28.366 [2024-05-15 00:01:28.833782] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6ab60 00:19:28.366 [2024-05-15 00:01:28.833795] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:28.366 [2024-05-15 00:01:28.835162] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:28.366 [2024-05-15 00:01:28.835191] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:28.366 pt2 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:28.366 00:01:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 
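From this point the superblock test stacks a passthru bdev (pt1..pt4) on top of each 32 MiB malloc bdev and then assembles the four passthru bdevs into a raid1 volume with an on-disk superblock. The sketch below condenses that setup into a loop; every RPC name, flag, size and UUID matches the calls recorded in this log, while the loop itself and the error handling are assumptions added for readability rather than a copy of bdev_raid.sh.

#!/usr/bin/env bash
# Sketch of the base-bdev stack built for raid_superblock_test:
# malloc -> passthru (fixed UUID), four times, then a raid1 bdev with a superblock (-s).
set -euo pipefail

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

for i in 1 2 3 4; do
    # 32 MiB backing store with 512-byte blocks (65536 blocks, as in the dumps above).
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "malloc$i"
    # Passthru layer with a deterministic UUID so the superblock contents are stable.
    "$rpc" -s "$sock" bdev_passthru_create -b "malloc$i" -p "pt$i" \
        -u "00000000-0000-0000-0000-00000000000$i"
done

# Assemble the four passthru bdevs into a raid1 array and write a superblock (-s).
"$rpc" -s "$sock" bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

Each bdev_passthru_create is acknowledged by the vbdev_passthru "Match on mallocN" / "pt_bdev registered" notices seen above, and the final bdev_raid_create is what produces the "bdev ptN is claimed" messages and the online raid_bdev1 dump that follows.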
00:19:28.623 malloc3 00:19:28.623 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:28.881 [2024-05-15 00:01:29.331587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:28.881 [2024-05-15 00:01:29.331631] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:28.881 [2024-05-15 00:01:29.331650] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d15080 00:19:28.881 [2024-05-15 00:01:29.331663] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:28.881 [2024-05-15 00:01:29.333029] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:28.881 [2024-05-15 00:01:29.333058] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:28.881 pt3 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:28.881 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:29.138 malloc4 00:19:29.138 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:29.396 [2024-05-15 00:01:29.814764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:29.396 [2024-05-15 00:01:29.814815] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:29.396 [2024-05-15 00:01:29.814839] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d17610 00:19:29.396 [2024-05-15 00:01:29.814853] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:29.396 [2024-05-15 00:01:29.816380] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:29.396 [2024-05-15 00:01:29.816419] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:29.396 pt4 00:19:29.396 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:19:29.396 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:19:29.396 00:01:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n 
raid_bdev1 -s 00:19:29.654 [2024-05-15 00:01:30.059454] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:29.654 [2024-05-15 00:01:30.060829] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:29.654 [2024-05-15 00:01:30.060889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:29.654 [2024-05-15 00:01:30.060934] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:29.654 [2024-05-15 00:01:30.061118] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d19480 00:19:29.654 [2024-05-15 00:01:30.061130] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:29.654 [2024-05-15 00:01:30.061340] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d18310 00:19:29.654 [2024-05-15 00:01:30.061511] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d19480 00:19:29.654 [2024-05-15 00:01:30.061522] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d19480 00:19:29.654 [2024-05-15 00:01:30.061632] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.654 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.912 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:29.912 "name": "raid_bdev1", 00:19:29.912 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:29.912 "strip_size_kb": 0, 00:19:29.912 "state": "online", 00:19:29.912 "raid_level": "raid1", 00:19:29.912 "superblock": true, 00:19:29.912 "num_base_bdevs": 4, 00:19:29.912 "num_base_bdevs_discovered": 4, 00:19:29.912 "num_base_bdevs_operational": 4, 00:19:29.912 "base_bdevs_list": [ 00:19:29.912 { 00:19:29.912 "name": "pt1", 00:19:29.912 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:29.912 "is_configured": true, 00:19:29.912 "data_offset": 2048, 00:19:29.912 "data_size": 63488 00:19:29.912 }, 00:19:29.912 { 00:19:29.912 "name": "pt2", 00:19:29.912 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:29.912 "is_configured": true, 00:19:29.912 "data_offset": 
2048, 00:19:29.912 "data_size": 63488 00:19:29.912 }, 00:19:29.912 { 00:19:29.912 "name": "pt3", 00:19:29.912 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:29.912 "is_configured": true, 00:19:29.912 "data_offset": 2048, 00:19:29.912 "data_size": 63488 00:19:29.912 }, 00:19:29.912 { 00:19:29.912 "name": "pt4", 00:19:29.912 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:29.912 "is_configured": true, 00:19:29.912 "data_offset": 2048, 00:19:29.912 "data_size": 63488 00:19:29.912 } 00:19:29.912 ] 00:19:29.912 }' 00:19:29.912 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:29.912 00:01:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:19:30.475 00:01:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:30.733 [2024-05-15 00:01:31.082406] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:19:30.733 "name": "raid_bdev1", 00:19:30.733 "aliases": [ 00:19:30.733 "2e63b30b-3c80-4df1-98e4-ad1e9c78912b" 00:19:30.733 ], 00:19:30.733 "product_name": "Raid Volume", 00:19:30.733 "block_size": 512, 00:19:30.733 "num_blocks": 63488, 00:19:30.733 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:30.733 "assigned_rate_limits": { 00:19:30.733 "rw_ios_per_sec": 0, 00:19:30.733 "rw_mbytes_per_sec": 0, 00:19:30.733 "r_mbytes_per_sec": 0, 00:19:30.733 "w_mbytes_per_sec": 0 00:19:30.733 }, 00:19:30.733 "claimed": false, 00:19:30.733 "zoned": false, 00:19:30.733 "supported_io_types": { 00:19:30.733 "read": true, 00:19:30.733 "write": true, 00:19:30.733 "unmap": false, 00:19:30.733 "write_zeroes": true, 00:19:30.733 "flush": false, 00:19:30.733 "reset": true, 00:19:30.733 "compare": false, 00:19:30.733 "compare_and_write": false, 00:19:30.733 "abort": false, 00:19:30.733 "nvme_admin": false, 00:19:30.733 "nvme_io": false 00:19:30.733 }, 00:19:30.733 "memory_domains": [ 00:19:30.733 { 00:19:30.733 "dma_device_id": "system", 00:19:30.733 "dma_device_type": 1 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.733 "dma_device_type": 2 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "system", 00:19:30.733 "dma_device_type": 1 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.733 "dma_device_type": 2 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "system", 00:19:30.733 "dma_device_type": 1 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.733 "dma_device_type": 2 00:19:30.733 }, 00:19:30.733 
{ 00:19:30.733 "dma_device_id": "system", 00:19:30.733 "dma_device_type": 1 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.733 "dma_device_type": 2 00:19:30.733 } 00:19:30.733 ], 00:19:30.733 "driver_specific": { 00:19:30.733 "raid": { 00:19:30.733 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:30.733 "strip_size_kb": 0, 00:19:30.733 "state": "online", 00:19:30.733 "raid_level": "raid1", 00:19:30.733 "superblock": true, 00:19:30.733 "num_base_bdevs": 4, 00:19:30.733 "num_base_bdevs_discovered": 4, 00:19:30.733 "num_base_bdevs_operational": 4, 00:19:30.733 "base_bdevs_list": [ 00:19:30.733 { 00:19:30.733 "name": "pt1", 00:19:30.733 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:30.733 "is_configured": true, 00:19:30.733 "data_offset": 2048, 00:19:30.733 "data_size": 63488 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "name": "pt2", 00:19:30.733 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:30.733 "is_configured": true, 00:19:30.733 "data_offset": 2048, 00:19:30.733 "data_size": 63488 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "name": "pt3", 00:19:30.733 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:30.733 "is_configured": true, 00:19:30.733 "data_offset": 2048, 00:19:30.733 "data_size": 63488 00:19:30.733 }, 00:19:30.733 { 00:19:30.733 "name": "pt4", 00:19:30.733 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:30.733 "is_configured": true, 00:19:30.733 "data_offset": 2048, 00:19:30.733 "data_size": 63488 00:19:30.733 } 00:19:30.733 ] 00:19:30.733 } 00:19:30.733 } 00:19:30.733 }' 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:19:30.733 pt2 00:19:30.733 pt3 00:19:30.733 pt4' 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:30.733 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:30.990 "name": "pt1", 00:19:30.990 "aliases": [ 00:19:30.990 "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27" 00:19:30.990 ], 00:19:30.990 "product_name": "passthru", 00:19:30.990 "block_size": 512, 00:19:30.990 "num_blocks": 65536, 00:19:30.990 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:30.990 "assigned_rate_limits": { 00:19:30.990 "rw_ios_per_sec": 0, 00:19:30.990 "rw_mbytes_per_sec": 0, 00:19:30.990 "r_mbytes_per_sec": 0, 00:19:30.990 "w_mbytes_per_sec": 0 00:19:30.990 }, 00:19:30.990 "claimed": true, 00:19:30.990 "claim_type": "exclusive_write", 00:19:30.990 "zoned": false, 00:19:30.990 "supported_io_types": { 00:19:30.990 "read": true, 00:19:30.990 "write": true, 00:19:30.990 "unmap": true, 00:19:30.990 "write_zeroes": true, 00:19:30.990 "flush": true, 00:19:30.990 "reset": true, 00:19:30.990 "compare": false, 00:19:30.990 "compare_and_write": false, 00:19:30.990 "abort": true, 00:19:30.990 "nvme_admin": false, 00:19:30.990 "nvme_io": false 00:19:30.990 }, 00:19:30.990 "memory_domains": [ 00:19:30.990 { 00:19:30.990 "dma_device_id": "system", 00:19:30.990 "dma_device_type": 1 00:19:30.990 }, 00:19:30.990 { 
00:19:30.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.990 "dma_device_type": 2 00:19:30.990 } 00:19:30.990 ], 00:19:30.990 "driver_specific": { 00:19:30.990 "passthru": { 00:19:30.990 "name": "pt1", 00:19:30.990 "base_bdev_name": "malloc1" 00:19:30.990 } 00:19:30.990 } 00:19:30.990 }' 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.990 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:31.248 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:31.511 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:31.511 "name": "pt2", 00:19:31.511 "aliases": [ 00:19:31.511 "e1e83b46-f2bc-535e-be68-86befdbe25dd" 00:19:31.511 ], 00:19:31.511 "product_name": "passthru", 00:19:31.511 "block_size": 512, 00:19:31.511 "num_blocks": 65536, 00:19:31.511 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:31.511 "assigned_rate_limits": { 00:19:31.511 "rw_ios_per_sec": 0, 00:19:31.511 "rw_mbytes_per_sec": 0, 00:19:31.511 "r_mbytes_per_sec": 0, 00:19:31.511 "w_mbytes_per_sec": 0 00:19:31.511 }, 00:19:31.511 "claimed": true, 00:19:31.511 "claim_type": "exclusive_write", 00:19:31.511 "zoned": false, 00:19:31.511 "supported_io_types": { 00:19:31.511 "read": true, 00:19:31.511 "write": true, 00:19:31.511 "unmap": true, 00:19:31.511 "write_zeroes": true, 00:19:31.511 "flush": true, 00:19:31.511 "reset": true, 00:19:31.511 "compare": false, 00:19:31.511 "compare_and_write": false, 00:19:31.511 "abort": true, 00:19:31.511 "nvme_admin": false, 00:19:31.511 "nvme_io": false 00:19:31.511 }, 00:19:31.511 "memory_domains": [ 00:19:31.511 { 00:19:31.511 "dma_device_id": "system", 00:19:31.511 "dma_device_type": 1 00:19:31.511 }, 00:19:31.511 { 00:19:31.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.511 "dma_device_type": 2 00:19:31.511 } 00:19:31.511 ], 00:19:31.511 "driver_specific": { 00:19:31.511 "passthru": { 00:19:31.511 "name": "pt2", 00:19:31.511 "base_bdev_name": "malloc2" 00:19:31.511 } 00:19:31.511 } 00:19:31.511 }' 00:19:31.511 00:01:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:31.511 00:01:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:31.511 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:31.511 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:31.772 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:32.030 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:32.030 "name": "pt3", 00:19:32.030 "aliases": [ 00:19:32.030 "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3" 00:19:32.030 ], 00:19:32.030 "product_name": "passthru", 00:19:32.030 "block_size": 512, 00:19:32.030 "num_blocks": 65536, 00:19:32.030 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:32.030 "assigned_rate_limits": { 00:19:32.030 "rw_ios_per_sec": 0, 00:19:32.030 "rw_mbytes_per_sec": 0, 00:19:32.030 "r_mbytes_per_sec": 0, 00:19:32.030 "w_mbytes_per_sec": 0 00:19:32.030 }, 00:19:32.030 "claimed": true, 00:19:32.030 "claim_type": "exclusive_write", 00:19:32.030 "zoned": false, 00:19:32.030 "supported_io_types": { 00:19:32.030 "read": true, 00:19:32.030 "write": true, 00:19:32.030 "unmap": true, 00:19:32.030 "write_zeroes": true, 00:19:32.030 "flush": true, 00:19:32.030 "reset": true, 00:19:32.030 "compare": false, 00:19:32.030 "compare_and_write": false, 00:19:32.030 "abort": true, 00:19:32.030 "nvme_admin": false, 00:19:32.030 "nvme_io": false 00:19:32.030 }, 00:19:32.030 "memory_domains": [ 00:19:32.030 { 00:19:32.030 "dma_device_id": "system", 00:19:32.030 "dma_device_type": 1 00:19:32.030 }, 00:19:32.030 { 00:19:32.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.030 "dma_device_type": 2 00:19:32.030 } 00:19:32.030 ], 00:19:32.030 "driver_specific": { 00:19:32.030 "passthru": { 00:19:32.030 "name": "pt3", 00:19:32.030 "base_bdev_name": "malloc3" 00:19:32.030 } 00:19:32.030 } 00:19:32.030 }' 00:19:32.030 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:32.030 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:32.287 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:32.288 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:32.546 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:32.546 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:32.546 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:32.546 00:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:32.803 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:32.803 "name": "pt4", 00:19:32.803 "aliases": [ 00:19:32.803 "4750b7a8-3d01-541f-9fac-6495a1d6cf49" 00:19:32.803 ], 00:19:32.803 "product_name": "passthru", 00:19:32.803 "block_size": 512, 00:19:32.803 "num_blocks": 65536, 00:19:32.803 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:32.803 "assigned_rate_limits": { 00:19:32.804 "rw_ios_per_sec": 0, 00:19:32.804 "rw_mbytes_per_sec": 0, 00:19:32.804 "r_mbytes_per_sec": 0, 00:19:32.804 "w_mbytes_per_sec": 0 00:19:32.804 }, 00:19:32.804 "claimed": true, 00:19:32.804 "claim_type": "exclusive_write", 00:19:32.804 "zoned": false, 00:19:32.804 "supported_io_types": { 00:19:32.804 "read": true, 00:19:32.804 "write": true, 00:19:32.804 "unmap": true, 00:19:32.804 "write_zeroes": true, 00:19:32.804 "flush": true, 00:19:32.804 "reset": true, 00:19:32.804 "compare": false, 00:19:32.804 "compare_and_write": false, 00:19:32.804 "abort": true, 00:19:32.804 "nvme_admin": false, 00:19:32.804 "nvme_io": false 00:19:32.804 }, 00:19:32.804 "memory_domains": [ 00:19:32.804 { 00:19:32.804 "dma_device_id": "system", 00:19:32.804 "dma_device_type": 1 00:19:32.804 }, 00:19:32.804 { 00:19:32.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.804 "dma_device_type": 2 00:19:32.804 } 00:19:32.804 ], 00:19:32.804 "driver_specific": { 00:19:32.804 "passthru": { 00:19:32.804 "name": "pt4", 00:19:32.804 "base_bdev_name": "malloc4" 00:19:32.804 } 00:19:32.804 } 00:19:32.804 }' 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:32.804 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq 
.dif_type 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:33.060 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:19:33.317 [2024-05-15 00:01:33.729421] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:33.317 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=2e63b30b-3c80-4df1-98e4-ad1e9c78912b 00:19:33.317 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 2e63b30b-3c80-4df1-98e4-ad1e9c78912b ']' 00:19:33.317 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:33.575 [2024-05-15 00:01:33.973788] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:33.575 [2024-05-15 00:01:33.973813] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:33.575 [2024-05-15 00:01:33.973871] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:33.575 [2024-05-15 00:01:33.973953] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:33.575 [2024-05-15 00:01:33.973966] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d19480 name raid_bdev1, state offline 00:19:33.575 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.575 00:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:19:33.832 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:19:33.832 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:19:33.832 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:19:33.832 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:34.090 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:19:34.090 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:34.347 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:19:34.347 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:34.603 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:19:34.603 00:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:34.861 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:35.122 [2024-05-15 00:01:35.678227] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:35.122 [2024-05-15 00:01:35.679645] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:35.122 [2024-05-15 00:01:35.679693] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:35.122 [2024-05-15 00:01:35.679729] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:35.122 [2024-05-15 00:01:35.679777] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:35.122 [2024-05-15 00:01:35.679819] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:35.122 [2024-05-15 00:01:35.679842] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:35.122 [2024-05-15 00:01:35.679863] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock 
of a different raid bdev found on bdev malloc4 00:19:35.122 [2024-05-15 00:01:35.679882] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:35.122 [2024-05-15 00:01:35.679893] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b68930 name raid_bdev1, state configuring 00:19:35.122 request: 00:19:35.122 { 00:19:35.122 "name": "raid_bdev1", 00:19:35.122 "raid_level": "raid1", 00:19:35.122 "base_bdevs": [ 00:19:35.122 "malloc1", 00:19:35.122 "malloc2", 00:19:35.122 "malloc3", 00:19:35.122 "malloc4" 00:19:35.122 ], 00:19:35.122 "superblock": false, 00:19:35.122 "method": "bdev_raid_create", 00:19:35.122 "req_id": 1 00:19:35.122 } 00:19:35.122 Got JSON-RPC error response 00:19:35.122 response: 00:19:35.122 { 00:19:35.122 "code": -17, 00:19:35.122 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:35.122 } 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.122 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:19:35.428 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:19:35.428 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:19:35.428 00:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:35.686 [2024-05-15 00:01:36.159432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:35.686 [2024-05-15 00:01:36.159494] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:35.686 [2024-05-15 00:01:36.159516] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d12aa0 00:19:35.686 [2024-05-15 00:01:36.159530] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:35.686 [2024-05-15 00:01:36.161231] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:35.686 [2024-05-15 00:01:36.161264] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:35.686 [2024-05-15 00:01:36.161341] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:19:35.686 [2024-05-15 00:01:36.161369] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:35.686 pt1 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.686 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.944 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:35.944 "name": "raid_bdev1", 00:19:35.944 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:35.944 "strip_size_kb": 0, 00:19:35.944 "state": "configuring", 00:19:35.944 "raid_level": "raid1", 00:19:35.944 "superblock": true, 00:19:35.944 "num_base_bdevs": 4, 00:19:35.944 "num_base_bdevs_discovered": 1, 00:19:35.944 "num_base_bdevs_operational": 4, 00:19:35.944 "base_bdevs_list": [ 00:19:35.944 { 00:19:35.944 "name": "pt1", 00:19:35.944 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:35.944 "is_configured": true, 00:19:35.944 "data_offset": 2048, 00:19:35.944 "data_size": 63488 00:19:35.944 }, 00:19:35.944 { 00:19:35.944 "name": null, 00:19:35.944 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:35.944 "is_configured": false, 00:19:35.944 "data_offset": 2048, 00:19:35.944 "data_size": 63488 00:19:35.944 }, 00:19:35.944 { 00:19:35.944 "name": null, 00:19:35.944 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:35.944 "is_configured": false, 00:19:35.944 "data_offset": 2048, 00:19:35.944 "data_size": 63488 00:19:35.944 }, 00:19:35.944 { 00:19:35.944 "name": null, 00:19:35.944 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:35.944 "is_configured": false, 00:19:35.944 "data_offset": 2048, 00:19:35.944 "data_size": 63488 00:19:35.944 } 00:19:35.944 ] 00:19:35.944 }' 00:19:35.944 00:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:35.944 00:01:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.507 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:19:36.507 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:36.764 [2024-05-15 00:01:37.242306] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:36.764 [2024-05-15 00:01:37.242360] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.764 [2024-05-15 00:01:37.242383] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d17840 00:19:36.764 [2024-05-15 00:01:37.242397] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.764 [2024-05-15 00:01:37.242770] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.764 [2024-05-15 00:01:37.242790] vbdev_passthru.c: 705:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:19:36.764 [2024-05-15 00:01:37.242853] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:36.764 [2024-05-15 00:01:37.242879] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:36.764 pt2 00:19:36.764 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:37.022 [2024-05-15 00:01:37.482953] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.022 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.281 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:37.281 "name": "raid_bdev1", 00:19:37.281 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:37.281 "strip_size_kb": 0, 00:19:37.281 "state": "configuring", 00:19:37.281 "raid_level": "raid1", 00:19:37.281 "superblock": true, 00:19:37.281 "num_base_bdevs": 4, 00:19:37.281 "num_base_bdevs_discovered": 1, 00:19:37.281 "num_base_bdevs_operational": 4, 00:19:37.281 "base_bdevs_list": [ 00:19:37.281 { 00:19:37.281 "name": "pt1", 00:19:37.281 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:37.281 "is_configured": true, 00:19:37.281 "data_offset": 2048, 00:19:37.281 "data_size": 63488 00:19:37.281 }, 00:19:37.281 { 00:19:37.281 "name": null, 00:19:37.281 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:37.281 "is_configured": false, 00:19:37.281 "data_offset": 2048, 00:19:37.281 "data_size": 63488 00:19:37.281 }, 00:19:37.281 { 00:19:37.281 "name": null, 00:19:37.281 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:37.281 "is_configured": false, 00:19:37.281 "data_offset": 2048, 00:19:37.281 "data_size": 63488 00:19:37.281 }, 00:19:37.281 { 00:19:37.281 "name": null, 00:19:37.281 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:37.281 "is_configured": false, 00:19:37.281 "data_offset": 2048, 00:19:37.281 "data_size": 63488 00:19:37.281 } 00:19:37.281 ] 00:19:37.281 }' 00:19:37.281 00:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:19:37.281 00:01:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.847 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:19:37.847 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:37.847 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:38.105 [2024-05-15 00:01:38.565822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:38.105 [2024-05-15 00:01:38.565885] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.105 [2024-05-15 00:01:38.565910] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b69b50 00:19:38.105 [2024-05-15 00:01:38.565925] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.105 [2024-05-15 00:01:38.566302] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.105 [2024-05-15 00:01:38.566321] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:38.105 [2024-05-15 00:01:38.566390] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:38.105 [2024-05-15 00:01:38.566425] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:38.105 pt2 00:19:38.105 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:38.105 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:38.105 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:38.363 [2024-05-15 00:01:38.806463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:38.363 [2024-05-15 00:01:38.806502] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.363 [2024-05-15 00:01:38.806521] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d18e50 00:19:38.363 [2024-05-15 00:01:38.806533] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.363 [2024-05-15 00:01:38.806877] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.363 [2024-05-15 00:01:38.806896] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:38.363 [2024-05-15 00:01:38.806957] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:38.363 [2024-05-15 00:01:38.806975] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:38.363 pt3 00:19:38.363 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:38.363 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:38.363 00:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:38.621 [2024-05-15 00:01:39.035069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 
00:19:38.621 [2024-05-15 00:01:39.035113] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.621 [2024-05-15 00:01:39.035133] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b680e0 00:19:38.621 [2024-05-15 00:01:39.035145] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.621 [2024-05-15 00:01:39.035491] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.621 [2024-05-15 00:01:39.035511] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:38.621 [2024-05-15 00:01:39.035574] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:38.621 [2024-05-15 00:01:39.035593] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:38.621 [2024-05-15 00:01:39.035718] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d1a700 00:19:38.621 [2024-05-15 00:01:39.035729] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:38.621 [2024-05-15 00:01:39.035900] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d0bcd0 00:19:38.621 [2024-05-15 00:01:39.036040] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d1a700 00:19:38.621 [2024-05-15 00:01:39.036050] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d1a700 00:19:38.621 [2024-05-15 00:01:39.036147] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.621 pt4 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.621 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.879 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:38.879 "name": "raid_bdev1", 00:19:38.879 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:38.879 "strip_size_kb": 0, 00:19:38.879 "state": "online", 00:19:38.879 "raid_level": "raid1", 
00:19:38.879 "superblock": true, 00:19:38.879 "num_base_bdevs": 4, 00:19:38.879 "num_base_bdevs_discovered": 4, 00:19:38.879 "num_base_bdevs_operational": 4, 00:19:38.880 "base_bdevs_list": [ 00:19:38.880 { 00:19:38.880 "name": "pt1", 00:19:38.880 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:38.880 "is_configured": true, 00:19:38.880 "data_offset": 2048, 00:19:38.880 "data_size": 63488 00:19:38.880 }, 00:19:38.880 { 00:19:38.880 "name": "pt2", 00:19:38.880 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:38.880 "is_configured": true, 00:19:38.880 "data_offset": 2048, 00:19:38.880 "data_size": 63488 00:19:38.880 }, 00:19:38.880 { 00:19:38.880 "name": "pt3", 00:19:38.880 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:38.880 "is_configured": true, 00:19:38.880 "data_offset": 2048, 00:19:38.880 "data_size": 63488 00:19:38.880 }, 00:19:38.880 { 00:19:38.880 "name": "pt4", 00:19:38.880 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:38.880 "is_configured": true, 00:19:38.880 "data_offset": 2048, 00:19:38.880 "data_size": 63488 00:19:38.880 } 00:19:38.880 ] 00:19:38.880 }' 00:19:38.880 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:38.880 00:01:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:39.448 00:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:19:39.707 [2024-05-15 00:01:40.118236] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:19:39.707 "name": "raid_bdev1", 00:19:39.707 "aliases": [ 00:19:39.707 "2e63b30b-3c80-4df1-98e4-ad1e9c78912b" 00:19:39.707 ], 00:19:39.707 "product_name": "Raid Volume", 00:19:39.707 "block_size": 512, 00:19:39.707 "num_blocks": 63488, 00:19:39.707 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:39.707 "assigned_rate_limits": { 00:19:39.707 "rw_ios_per_sec": 0, 00:19:39.707 "rw_mbytes_per_sec": 0, 00:19:39.707 "r_mbytes_per_sec": 0, 00:19:39.707 "w_mbytes_per_sec": 0 00:19:39.707 }, 00:19:39.707 "claimed": false, 00:19:39.707 "zoned": false, 00:19:39.707 "supported_io_types": { 00:19:39.707 "read": true, 00:19:39.707 "write": true, 00:19:39.707 "unmap": false, 00:19:39.707 "write_zeroes": true, 00:19:39.707 "flush": false, 00:19:39.707 "reset": true, 00:19:39.707 "compare": false, 00:19:39.707 "compare_and_write": false, 00:19:39.707 "abort": false, 00:19:39.707 "nvme_admin": false, 00:19:39.707 "nvme_io": false 00:19:39.707 }, 00:19:39.707 "memory_domains": [ 00:19:39.707 { 00:19:39.707 "dma_device_id": "system", 00:19:39.707 "dma_device_type": 1 00:19:39.707 }, 
00:19:39.707 { 00:19:39.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.707 "dma_device_type": 2 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "system", 00:19:39.707 "dma_device_type": 1 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.707 "dma_device_type": 2 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "system", 00:19:39.707 "dma_device_type": 1 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.707 "dma_device_type": 2 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "system", 00:19:39.707 "dma_device_type": 1 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.707 "dma_device_type": 2 00:19:39.707 } 00:19:39.707 ], 00:19:39.707 "driver_specific": { 00:19:39.707 "raid": { 00:19:39.707 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:39.707 "strip_size_kb": 0, 00:19:39.707 "state": "online", 00:19:39.707 "raid_level": "raid1", 00:19:39.707 "superblock": true, 00:19:39.707 "num_base_bdevs": 4, 00:19:39.707 "num_base_bdevs_discovered": 4, 00:19:39.707 "num_base_bdevs_operational": 4, 00:19:39.707 "base_bdevs_list": [ 00:19:39.707 { 00:19:39.707 "name": "pt1", 00:19:39.707 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:39.707 "is_configured": true, 00:19:39.707 "data_offset": 2048, 00:19:39.707 "data_size": 63488 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "name": "pt2", 00:19:39.707 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:39.707 "is_configured": true, 00:19:39.707 "data_offset": 2048, 00:19:39.707 "data_size": 63488 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "name": "pt3", 00:19:39.707 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:39.707 "is_configured": true, 00:19:39.707 "data_offset": 2048, 00:19:39.707 "data_size": 63488 00:19:39.707 }, 00:19:39.707 { 00:19:39.707 "name": "pt4", 00:19:39.707 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:39.707 "is_configured": true, 00:19:39.707 "data_offset": 2048, 00:19:39.707 "data_size": 63488 00:19:39.707 } 00:19:39.707 ] 00:19:39.707 } 00:19:39.707 } 00:19:39.707 }' 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:19:39.707 pt2 00:19:39.707 pt3 00:19:39.707 pt4' 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:39.707 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:39.965 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:39.965 "name": "pt1", 00:19:39.965 "aliases": [ 00:19:39.965 "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27" 00:19:39.965 ], 00:19:39.965 "product_name": "passthru", 00:19:39.965 "block_size": 512, 00:19:39.965 "num_blocks": 65536, 00:19:39.965 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:39.965 "assigned_rate_limits": { 00:19:39.965 "rw_ios_per_sec": 0, 00:19:39.965 "rw_mbytes_per_sec": 0, 00:19:39.965 "r_mbytes_per_sec": 0, 00:19:39.965 "w_mbytes_per_sec": 0 00:19:39.965 }, 00:19:39.965 "claimed": true, 00:19:39.965 "claim_type": 
"exclusive_write", 00:19:39.965 "zoned": false, 00:19:39.965 "supported_io_types": { 00:19:39.965 "read": true, 00:19:39.965 "write": true, 00:19:39.965 "unmap": true, 00:19:39.965 "write_zeroes": true, 00:19:39.965 "flush": true, 00:19:39.965 "reset": true, 00:19:39.965 "compare": false, 00:19:39.965 "compare_and_write": false, 00:19:39.965 "abort": true, 00:19:39.965 "nvme_admin": false, 00:19:39.965 "nvme_io": false 00:19:39.965 }, 00:19:39.965 "memory_domains": [ 00:19:39.965 { 00:19:39.965 "dma_device_id": "system", 00:19:39.965 "dma_device_type": 1 00:19:39.965 }, 00:19:39.965 { 00:19:39.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.965 "dma_device_type": 2 00:19:39.965 } 00:19:39.965 ], 00:19:39.965 "driver_specific": { 00:19:39.965 "passthru": { 00:19:39.965 "name": "pt1", 00:19:39.965 "base_bdev_name": "malloc1" 00:19:39.966 } 00:19:39.966 } 00:19:39.966 }' 00:19:39.966 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:39.966 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:39.966 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:39.966 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:40.224 00:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:40.482 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:40.482 "name": "pt2", 00:19:40.482 "aliases": [ 00:19:40.482 "e1e83b46-f2bc-535e-be68-86befdbe25dd" 00:19:40.482 ], 00:19:40.482 "product_name": "passthru", 00:19:40.482 "block_size": 512, 00:19:40.482 "num_blocks": 65536, 00:19:40.482 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:40.482 "assigned_rate_limits": { 00:19:40.482 "rw_ios_per_sec": 0, 00:19:40.482 "rw_mbytes_per_sec": 0, 00:19:40.482 "r_mbytes_per_sec": 0, 00:19:40.482 "w_mbytes_per_sec": 0 00:19:40.482 }, 00:19:40.482 "claimed": true, 00:19:40.482 "claim_type": "exclusive_write", 00:19:40.482 "zoned": false, 00:19:40.482 "supported_io_types": { 00:19:40.482 "read": true, 00:19:40.482 "write": true, 00:19:40.482 "unmap": true, 00:19:40.482 "write_zeroes": true, 00:19:40.483 "flush": true, 00:19:40.483 "reset": true, 00:19:40.483 "compare": false, 00:19:40.483 "compare_and_write": false, 00:19:40.483 "abort": true, 00:19:40.483 "nvme_admin": false, 00:19:40.483 "nvme_io": false 00:19:40.483 
}, 00:19:40.483 "memory_domains": [ 00:19:40.483 { 00:19:40.483 "dma_device_id": "system", 00:19:40.483 "dma_device_type": 1 00:19:40.483 }, 00:19:40.483 { 00:19:40.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.483 "dma_device_type": 2 00:19:40.483 } 00:19:40.483 ], 00:19:40.483 "driver_specific": { 00:19:40.483 "passthru": { 00:19:40.483 "name": "pt2", 00:19:40.483 "base_bdev_name": "malloc2" 00:19:40.483 } 00:19:40.483 } 00:19:40.483 }' 00:19:40.483 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:40.483 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.741 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:40.999 "name": "pt3", 00:19:40.999 "aliases": [ 00:19:40.999 "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3" 00:19:40.999 ], 00:19:40.999 "product_name": "passthru", 00:19:40.999 "block_size": 512, 00:19:40.999 "num_blocks": 65536, 00:19:40.999 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:40.999 "assigned_rate_limits": { 00:19:40.999 "rw_ios_per_sec": 0, 00:19:40.999 "rw_mbytes_per_sec": 0, 00:19:40.999 "r_mbytes_per_sec": 0, 00:19:40.999 "w_mbytes_per_sec": 0 00:19:40.999 }, 00:19:40.999 "claimed": true, 00:19:40.999 "claim_type": "exclusive_write", 00:19:40.999 "zoned": false, 00:19:40.999 "supported_io_types": { 00:19:40.999 "read": true, 00:19:40.999 "write": true, 00:19:40.999 "unmap": true, 00:19:40.999 "write_zeroes": true, 00:19:40.999 "flush": true, 00:19:40.999 "reset": true, 00:19:40.999 "compare": false, 00:19:40.999 "compare_and_write": false, 00:19:40.999 "abort": true, 00:19:40.999 "nvme_admin": false, 00:19:40.999 "nvme_io": false 00:19:40.999 }, 00:19:40.999 "memory_domains": [ 00:19:40.999 { 00:19:40.999 "dma_device_id": "system", 00:19:40.999 "dma_device_type": 1 00:19:40.999 }, 00:19:40.999 { 00:19:40.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.999 "dma_device_type": 2 00:19:40.999 } 00:19:40.999 ], 00:19:40.999 "driver_specific": { 00:19:40.999 "passthru": { 00:19:40.999 "name": "pt3", 00:19:40.999 "base_bdev_name": "malloc3" 00:19:40.999 } 00:19:40.999 } 
00:19:40.999 }' 00:19:40.999 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.257 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:41.515 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:41.515 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:41.515 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:41.515 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:41.515 00:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:41.774 "name": "pt4", 00:19:41.774 "aliases": [ 00:19:41.774 "4750b7a8-3d01-541f-9fac-6495a1d6cf49" 00:19:41.774 ], 00:19:41.774 "product_name": "passthru", 00:19:41.774 "block_size": 512, 00:19:41.774 "num_blocks": 65536, 00:19:41.774 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:41.774 "assigned_rate_limits": { 00:19:41.774 "rw_ios_per_sec": 0, 00:19:41.774 "rw_mbytes_per_sec": 0, 00:19:41.774 "r_mbytes_per_sec": 0, 00:19:41.774 "w_mbytes_per_sec": 0 00:19:41.774 }, 00:19:41.774 "claimed": true, 00:19:41.774 "claim_type": "exclusive_write", 00:19:41.774 "zoned": false, 00:19:41.774 "supported_io_types": { 00:19:41.774 "read": true, 00:19:41.774 "write": true, 00:19:41.774 "unmap": true, 00:19:41.774 "write_zeroes": true, 00:19:41.774 "flush": true, 00:19:41.774 "reset": true, 00:19:41.774 "compare": false, 00:19:41.774 "compare_and_write": false, 00:19:41.774 "abort": true, 00:19:41.774 "nvme_admin": false, 00:19:41.774 "nvme_io": false 00:19:41.774 }, 00:19:41.774 "memory_domains": [ 00:19:41.774 { 00:19:41.774 "dma_device_id": "system", 00:19:41.774 "dma_device_type": 1 00:19:41.774 }, 00:19:41.774 { 00:19:41.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.774 "dma_device_type": 2 00:19:41.774 } 00:19:41.774 ], 00:19:41.774 "driver_specific": { 00:19:41.774 "passthru": { 00:19:41.774 "name": "pt4", 00:19:41.774 "base_bdev_name": "malloc4" 00:19:41.774 } 00:19:41.774 } 00:19:41.774 }' 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:41.774 00:01:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.774 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:42.055 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:19:42.313 [2024-05-15 00:01:42.733163] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.313 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 2e63b30b-3c80-4df1-98e4-ad1e9c78912b '!=' 2e63b30b-3c80-4df1-98e4-ad1e9c78912b ']' 00:19:42.313 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:19:42.313 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:19:42.313 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:19:42.313 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:42.571 [2024-05-15 00:01:42.977541] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:42.571 00:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:42.571 00:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.571 00:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.829 00:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:42.829 "name": "raid_bdev1", 00:19:42.829 "uuid": 
"2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:42.829 "strip_size_kb": 0, 00:19:42.829 "state": "online", 00:19:42.829 "raid_level": "raid1", 00:19:42.829 "superblock": true, 00:19:42.829 "num_base_bdevs": 4, 00:19:42.829 "num_base_bdevs_discovered": 3, 00:19:42.829 "num_base_bdevs_operational": 3, 00:19:42.829 "base_bdevs_list": [ 00:19:42.829 { 00:19:42.829 "name": null, 00:19:42.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.829 "is_configured": false, 00:19:42.829 "data_offset": 2048, 00:19:42.829 "data_size": 63488 00:19:42.829 }, 00:19:42.829 { 00:19:42.829 "name": "pt2", 00:19:42.829 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:42.829 "is_configured": true, 00:19:42.829 "data_offset": 2048, 00:19:42.829 "data_size": 63488 00:19:42.829 }, 00:19:42.829 { 00:19:42.829 "name": "pt3", 00:19:42.830 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:42.830 "is_configured": true, 00:19:42.830 "data_offset": 2048, 00:19:42.830 "data_size": 63488 00:19:42.830 }, 00:19:42.830 { 00:19:42.830 "name": "pt4", 00:19:42.830 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:42.830 "is_configured": true, 00:19:42.830 "data_offset": 2048, 00:19:42.830 "data_size": 63488 00:19:42.830 } 00:19:42.830 ] 00:19:42.830 }' 00:19:42.830 00:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:42.830 00:01:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.395 00:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:43.653 [2024-05-15 00:01:44.056391] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:43.653 [2024-05-15 00:01:44.056427] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:43.653 [2024-05-15 00:01:44.056490] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:43.653 [2024-05-15 00:01:44.056561] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:43.653 [2024-05-15 00:01:44.056573] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d1a700 name raid_bdev1, state offline 00:19:43.653 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.653 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:19:43.911 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:19:43.911 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:19:43.911 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:19:43.911 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:43.911 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:44.169 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:44.169 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:44.169 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:44.427 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:44.427 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:44.427 00:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:44.685 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:44.685 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:44.685 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:19:44.685 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:44.685 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:44.943 [2024-05-15 00:01:45.283558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:44.943 [2024-05-15 00:01:45.283608] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.943 [2024-05-15 00:01:45.283627] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b686e0 00:19:44.943 [2024-05-15 00:01:45.283640] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.943 [2024-05-15 00:01:45.285271] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.943 [2024-05-15 00:01:45.285302] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:44.943 [2024-05-15 00:01:45.285372] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:44.943 [2024-05-15 00:01:45.285410] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:44.943 pt2 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.943 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.201 00:01:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:45.201 "name": "raid_bdev1", 00:19:45.201 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:45.201 "strip_size_kb": 0, 00:19:45.201 "state": "configuring", 00:19:45.201 "raid_level": "raid1", 00:19:45.201 "superblock": true, 00:19:45.201 "num_base_bdevs": 4, 00:19:45.201 "num_base_bdevs_discovered": 1, 00:19:45.201 "num_base_bdevs_operational": 3, 00:19:45.201 "base_bdevs_list": [ 00:19:45.201 { 00:19:45.201 "name": null, 00:19:45.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.201 "is_configured": false, 00:19:45.201 "data_offset": 2048, 00:19:45.201 "data_size": 63488 00:19:45.201 }, 00:19:45.201 { 00:19:45.201 "name": "pt2", 00:19:45.201 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:45.201 "is_configured": true, 00:19:45.201 "data_offset": 2048, 00:19:45.201 "data_size": 63488 00:19:45.201 }, 00:19:45.201 { 00:19:45.201 "name": null, 00:19:45.201 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:45.201 "is_configured": false, 00:19:45.201 "data_offset": 2048, 00:19:45.201 "data_size": 63488 00:19:45.201 }, 00:19:45.201 { 00:19:45.201 "name": null, 00:19:45.201 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:45.201 "is_configured": false, 00:19:45.201 "data_offset": 2048, 00:19:45.201 "data_size": 63488 00:19:45.201 } 00:19:45.201 ] 00:19:45.201 }' 00:19:45.201 00:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:45.201 00:01:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.766 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:19:45.766 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:45.766 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:46.024 [2024-05-15 00:01:46.370689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:46.024 [2024-05-15 00:01:46.370745] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:46.024 [2024-05-15 00:01:46.370767] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6a370 00:19:46.024 [2024-05-15 00:01:46.370780] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:46.024 [2024-05-15 00:01:46.371142] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:46.024 [2024-05-15 00:01:46.371160] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:46.024 [2024-05-15 00:01:46.371229] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:46.024 [2024-05-15 00:01:46.371247] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:46.024 pt3 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:46.024 00:01:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.024 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.282 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:46.282 "name": "raid_bdev1", 00:19:46.282 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:46.282 "strip_size_kb": 0, 00:19:46.282 "state": "configuring", 00:19:46.282 "raid_level": "raid1", 00:19:46.282 "superblock": true, 00:19:46.282 "num_base_bdevs": 4, 00:19:46.282 "num_base_bdevs_discovered": 2, 00:19:46.282 "num_base_bdevs_operational": 3, 00:19:46.282 "base_bdevs_list": [ 00:19:46.282 { 00:19:46.282 "name": null, 00:19:46.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.282 "is_configured": false, 00:19:46.282 "data_offset": 2048, 00:19:46.282 "data_size": 63488 00:19:46.282 }, 00:19:46.282 { 00:19:46.282 "name": "pt2", 00:19:46.282 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:46.282 "is_configured": true, 00:19:46.282 "data_offset": 2048, 00:19:46.282 "data_size": 63488 00:19:46.282 }, 00:19:46.282 { 00:19:46.282 "name": "pt3", 00:19:46.282 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:46.282 "is_configured": true, 00:19:46.282 "data_offset": 2048, 00:19:46.282 "data_size": 63488 00:19:46.282 }, 00:19:46.282 { 00:19:46.282 "name": null, 00:19:46.282 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:46.282 "is_configured": false, 00:19:46.282 "data_offset": 2048, 00:19:46.282 "data_size": 63488 00:19:46.282 } 00:19:46.282 ] 00:19:46.282 }' 00:19:46.282 00:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:46.282 00:01:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.848 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:19:46.848 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:46.848 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=3 00:19:46.848 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:47.107 [2024-05-15 00:01:47.445549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:47.107 [2024-05-15 00:01:47.445612] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.107 [2024-05-15 00:01:47.445637] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d08ce0 00:19:47.107 [2024-05-15 00:01:47.445652] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:19:47.107 [2024-05-15 00:01:47.446041] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.107 [2024-05-15 00:01:47.446063] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:47.107 [2024-05-15 00:01:47.446138] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:47.107 [2024-05-15 00:01:47.446158] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:47.107 [2024-05-15 00:01:47.446277] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b699b0 00:19:47.107 [2024-05-15 00:01:47.446288] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:47.107 [2024-05-15 00:01:47.446474] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d1ad50 00:19:47.107 [2024-05-15 00:01:47.446615] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b699b0 00:19:47.107 [2024-05-15 00:01:47.446625] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b699b0 00:19:47.107 [2024-05-15 00:01:47.446726] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:47.107 pt4 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.107 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.364 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:47.364 "name": "raid_bdev1", 00:19:47.364 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:47.364 "strip_size_kb": 0, 00:19:47.364 "state": "online", 00:19:47.364 "raid_level": "raid1", 00:19:47.364 "superblock": true, 00:19:47.364 "num_base_bdevs": 4, 00:19:47.364 "num_base_bdevs_discovered": 3, 00:19:47.364 "num_base_bdevs_operational": 3, 00:19:47.364 "base_bdevs_list": [ 00:19:47.364 { 00:19:47.364 "name": null, 00:19:47.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.364 "is_configured": false, 00:19:47.364 "data_offset": 2048, 00:19:47.364 "data_size": 63488 00:19:47.364 }, 00:19:47.364 { 00:19:47.364 "name": "pt2", 00:19:47.364 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:47.364 "is_configured": 
true, 00:19:47.364 "data_offset": 2048, 00:19:47.364 "data_size": 63488 00:19:47.364 }, 00:19:47.364 { 00:19:47.364 "name": "pt3", 00:19:47.364 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:47.364 "is_configured": true, 00:19:47.364 "data_offset": 2048, 00:19:47.364 "data_size": 63488 00:19:47.364 }, 00:19:47.364 { 00:19:47.364 "name": "pt4", 00:19:47.364 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:47.364 "is_configured": true, 00:19:47.364 "data_offset": 2048, 00:19:47.364 "data_size": 63488 00:19:47.364 } 00:19:47.364 ] 00:19:47.364 }' 00:19:47.364 00:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:47.364 00:01:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.932 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 4 -gt 2 ']' 00:19:47.932 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:47.932 [2024-05-15 00:01:48.420111] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:47.932 [2024-05-15 00:01:48.420142] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:47.932 [2024-05-15 00:01:48.420202] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:47.932 [2024-05-15 00:01:48.420275] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:47.932 [2024-05-15 00:01:48.420287] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b699b0 name raid_bdev1, state offline 00:19:47.932 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # jq -r '.[]' 00:19:47.932 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.190 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # raid_bdev= 00:19:48.190 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@529 -- # '[' -n '' ']' 00:19:48.190 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:48.448 [2024-05-15 00:01:48.853243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:48.448 [2024-05-15 00:01:48.853296] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.448 [2024-05-15 00:01:48.853317] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d13e10 00:19:48.448 [2024-05-15 00:01:48.853330] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.448 [2024-05-15 00:01:48.854965] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.448 [2024-05-15 00:01:48.854997] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:48.448 [2024-05-15 00:01:48.855066] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:19:48.448 [2024-05-15 00:01:48.855093] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:48.448 pt1 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state 
raid_bdev1 configuring raid1 0 4 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.448 00:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.707 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:48.707 "name": "raid_bdev1", 00:19:48.707 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:48.707 "strip_size_kb": 0, 00:19:48.707 "state": "configuring", 00:19:48.707 "raid_level": "raid1", 00:19:48.707 "superblock": true, 00:19:48.707 "num_base_bdevs": 4, 00:19:48.707 "num_base_bdevs_discovered": 1, 00:19:48.707 "num_base_bdevs_operational": 4, 00:19:48.707 "base_bdevs_list": [ 00:19:48.707 { 00:19:48.707 "name": "pt1", 00:19:48.707 "uuid": "14dc7e95-57a9-5bd2-bfd4-c3dc66ebef27", 00:19:48.707 "is_configured": true, 00:19:48.707 "data_offset": 2048, 00:19:48.707 "data_size": 63488 00:19:48.707 }, 00:19:48.707 { 00:19:48.707 "name": null, 00:19:48.707 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:48.707 "is_configured": false, 00:19:48.707 "data_offset": 2048, 00:19:48.707 "data_size": 63488 00:19:48.707 }, 00:19:48.707 { 00:19:48.707 "name": null, 00:19:48.707 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:48.707 "is_configured": false, 00:19:48.707 "data_offset": 2048, 00:19:48.707 "data_size": 63488 00:19:48.707 }, 00:19:48.707 { 00:19:48.707 "name": null, 00:19:48.707 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:48.707 "is_configured": false, 00:19:48.707 "data_offset": 2048, 00:19:48.707 "data_size": 63488 00:19:48.707 } 00:19:48.707 ] 00:19:48.707 }' 00:19:48.707 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:48.707 00:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.310 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i = 1 )) 00:19:49.310 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:49.310 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:49.571 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:49.571 00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:49.571 
00:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:49.828 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:49.828 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:49.828 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:50.085 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:50.085 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:50.085 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # i=3 00:19:50.085 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:50.351 [2024-05-15 00:01:50.678088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:50.351 [2024-05-15 00:01:50.678137] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.351 [2024-05-15 00:01:50.678158] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d16590 00:19:50.351 [2024-05-15 00:01:50.678172] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.351 [2024-05-15 00:01:50.678536] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.351 [2024-05-15 00:01:50.678565] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:50.351 [2024-05-15 00:01:50.678630] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:50.351 [2024-05-15 00:01:50.678642] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt4 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:50.351 [2024-05-15 00:01:50.678653] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:50.351 [2024-05-15 00:01:50.678668] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d18e50 name raid_bdev1, state configuring 00:19:50.351 [2024-05-15 00:01:50.678699] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:50.351 pt4 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@551 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local 
num_base_bdevs_discovered 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.351 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.608 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:50.608 "name": "raid_bdev1", 00:19:50.608 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:50.608 "strip_size_kb": 0, 00:19:50.608 "state": "configuring", 00:19:50.608 "raid_level": "raid1", 00:19:50.608 "superblock": true, 00:19:50.608 "num_base_bdevs": 4, 00:19:50.608 "num_base_bdevs_discovered": 1, 00:19:50.608 "num_base_bdevs_operational": 3, 00:19:50.608 "base_bdevs_list": [ 00:19:50.608 { 00:19:50.608 "name": null, 00:19:50.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.608 "is_configured": false, 00:19:50.608 "data_offset": 2048, 00:19:50.608 "data_size": 63488 00:19:50.608 }, 00:19:50.608 { 00:19:50.608 "name": null, 00:19:50.608 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:50.608 "is_configured": false, 00:19:50.608 "data_offset": 2048, 00:19:50.608 "data_size": 63488 00:19:50.608 }, 00:19:50.608 { 00:19:50.608 "name": null, 00:19:50.608 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:50.608 "is_configured": false, 00:19:50.608 "data_offset": 2048, 00:19:50.608 "data_size": 63488 00:19:50.608 }, 00:19:50.608 { 00:19:50.608 "name": "pt4", 00:19:50.608 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:50.608 "is_configured": true, 00:19:50.608 "data_offset": 2048, 00:19:50.608 "data_size": 63488 00:19:50.608 } 00:19:50.608 ] 00:19:50.608 }' 00:19:50.608 00:01:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:50.608 00:01:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i = 1 )) 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:51.174 [2024-05-15 00:01:51.744933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:51.174 [2024-05-15 00:01:51.744991] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.174 [2024-05-15 00:01:51.745013] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d191e0 00:19:51.174 [2024-05-15 00:01:51.745026] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.174 [2024-05-15 00:01:51.745393] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.174 [2024-05-15 00:01:51.745426] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:51.174 [2024-05-15 00:01:51.745496] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:51.174 [2024-05-15 00:01:51.745516] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:51.174 pt2 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@554 -- # (( i++ )) 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:51.174 00:01:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:51.441 [2024-05-15 00:01:51.985565] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:51.441 [2024-05-15 00:01:51.985614] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.441 [2024-05-15 00:01:51.985634] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d09360 00:19:51.441 [2024-05-15 00:01:51.985647] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.441 [2024-05-15 00:01:51.985987] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.441 [2024-05-15 00:01:51.986005] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:51.441 [2024-05-15 00:01:51.986069] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:51.441 [2024-05-15 00:01:51.986087] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:51.441 [2024-05-15 00:01:51.986209] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d0b850 00:19:51.441 [2024-05-15 00:01:51.986220] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:51.441 [2024-05-15 00:01:51.986385] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d0c1c0 00:19:51.441 [2024-05-15 00:01:51.986542] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d0b850 00:19:51.441 [2024-05-15 00:01:51.986553] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d0b850 00:19:51.441 [2024-05-15 00:01:51.986657] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.441 pt3 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i++ )) 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@559 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.441 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.702 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:51.702 "name": "raid_bdev1", 00:19:51.702 "uuid": "2e63b30b-3c80-4df1-98e4-ad1e9c78912b", 00:19:51.702 "strip_size_kb": 0, 00:19:51.702 "state": "online", 00:19:51.702 "raid_level": "raid1", 00:19:51.702 "superblock": true, 00:19:51.702 "num_base_bdevs": 4, 00:19:51.702 "num_base_bdevs_discovered": 3, 00:19:51.702 "num_base_bdevs_operational": 3, 00:19:51.702 "base_bdevs_list": [ 00:19:51.702 { 00:19:51.702 "name": null, 00:19:51.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.702 "is_configured": false, 00:19:51.702 "data_offset": 2048, 00:19:51.702 "data_size": 63488 00:19:51.702 }, 00:19:51.702 { 00:19:51.702 "name": "pt2", 00:19:51.702 "uuid": "e1e83b46-f2bc-535e-be68-86befdbe25dd", 00:19:51.702 "is_configured": true, 00:19:51.702 "data_offset": 2048, 00:19:51.702 "data_size": 63488 00:19:51.702 }, 00:19:51.702 { 00:19:51.702 "name": "pt3", 00:19:51.702 "uuid": "7d7994b9-c21d-5e5b-a49d-ac7cd3cb00a3", 00:19:51.702 "is_configured": true, 00:19:51.702 "data_offset": 2048, 00:19:51.702 "data_size": 63488 00:19:51.702 }, 00:19:51.702 { 00:19:51.702 "name": "pt4", 00:19:51.702 "uuid": "4750b7a8-3d01-541f-9fac-6495a1d6cf49", 00:19:51.702 "is_configured": true, 00:19:51.702 "data_offset": 2048, 00:19:51.702 "data_size": 63488 00:19:51.702 } 00:19:51.702 ] 00:19:51.702 }' 00:19:51.702 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:51.702 00:01:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.267 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:52.267 00:01:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:19:52.524 [2024-05-15 00:01:53.068710] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' 2e63b30b-3c80-4df1-98e4-ad1e9c78912b '!=' 2e63b30b-3c80-4df1-98e4-ad1e9c78912b ']' 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 468447 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 468447 ']' 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 468447 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:52.524 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 468447 00:19:52.782 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:52.782 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:52.782 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 468447' 00:19:52.782 killing process with pid 468447 00:19:52.782 00:01:53 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@965 -- # kill 468447 00:19:52.782 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 468447 00:19:52.782 [2024-05-15 00:01:53.129694] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:52.782 [2024-05-15 00:01:53.129768] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:52.782 [2024-05-15 00:01:53.129846] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:52.782 [2024-05-15 00:01:53.129860] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d0b850 name raid_bdev1, state offline 00:19:52.782 [2024-05-15 00:01:53.172136] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:53.059 00:01:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:19:53.059 00:19:53.059 real 0m26.553s 00:19:53.059 user 0m48.502s 00:19:53.059 sys 0m4.733s 00:19:53.059 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:53.059 00:01:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.059 ************************************ 00:19:53.059 END TEST raid_superblock_test 00:19:53.059 ************************************ 00:19:53.059 00:01:53 bdev_raid -- bdev/bdev_raid.sh@821 -- # '[' true = true ']' 00:19:53.059 00:01:53 bdev_raid -- bdev/bdev_raid.sh@822 -- # for n in 2 4 00:19:53.059 00:01:53 bdev_raid -- bdev/bdev_raid.sh@823 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:19:53.059 00:01:53 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:19:53.059 00:01:53 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:53.059 00:01:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:53.059 ************************************ 00:19:53.059 START TEST raid_rebuild_test 00:19:53.059 ************************************ 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false false true 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local verify=true 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:53.059 00:01:53 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # local strip_size 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@582 -- # local create_arg 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local data_offset 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # raid_pid=472414 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # waitforlisten 472414 /var/tmp/spdk-raid.sock 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 472414 ']' 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:53.059 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:53.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:53.060 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:53.060 00:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.060 00:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:53.060 [2024-05-15 00:01:53.579172] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:19:53.061 [2024-05-15 00:01:53.579238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472414 ] 00:19:53.061 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:53.061 Zero copy mechanism will not be used. 
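The trace above launches bdevperf as the RPC target for raid_rebuild_test and then blocks in waitforlisten until /var/tmp/spdk-raid.sock answers. A minimal standalone sketch of that launch-and-wait step is shown below; the SPDK_DIR and RPC_SOCK variables and the rpc_get_methods polling loop are illustrative stand-ins for the harness's waitforlisten helper, not code taken from the test itself.

    # Sketch only: start bdevperf with the same flags seen in the trace, then poll
    # the RPC socket before issuing any bdev_* RPCs (stand-in for waitforlisten).
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
    RPC_SOCK=/var/tmp/spdk-raid.sock
    "$SPDK_DIR/build/examples/bdevperf" -r "$RPC_SOCK" -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    until "$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done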
00:19:53.323 [2024-05-15 00:01:53.705477] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:53.323 [2024-05-15 00:01:53.807526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:53.323 [2024-05-15 00:01:53.863664] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:53.323 [2024-05-15 00:01:53.863699] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:54.256 00:01:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:54.256 00:01:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:19:54.256 00:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:54.256 00:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:54.256 BaseBdev1_malloc 00:19:54.256 00:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:54.514 [2024-05-15 00:01:54.955162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:54.514 [2024-05-15 00:01:54.955213] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:54.514 [2024-05-15 00:01:54.955237] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2764b50 00:19:54.514 [2024-05-15 00:01:54.955250] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:54.514 [2024-05-15 00:01:54.956985] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:54.514 [2024-05-15 00:01:54.957016] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:54.514 BaseBdev1 00:19:54.514 00:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:54.514 00:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:54.772 BaseBdev2_malloc 00:19:54.772 00:01:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:55.029 [2024-05-15 00:01:55.446095] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:55.029 [2024-05-15 00:01:55.446143] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.029 [2024-05-15 00:01:55.446165] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x290ad10 00:19:55.029 [2024-05-15 00:01:55.446177] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.029 [2024-05-15 00:01:55.447841] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.029 [2024-05-15 00:01:55.447873] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:55.029 BaseBdev2 00:19:55.029 00:01:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:55.287 spare_malloc 
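As the trace shows, each RAID base device is built as a malloc bdev wrapped in a passthru bdev before it is handed to the RAID layer. A condensed sketch of that per-bdev pattern follows; the RPC shell variable is shorthand introduced here for readability, not part of the test script.

    # Sketch only: one malloc backing bdev plus a passthru wrapper per base bdev.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2; do
        $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"           # 32 MiB, 512-byte blocks -> 65536 blocks
        $RPC bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done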
00:19:55.287 00:01:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:55.287 spare_delay 00:19:55.545 00:01:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:55.545 [2024-05-15 00:01:56.121771] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:55.545 [2024-05-15 00:01:56.121817] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.545 [2024-05-15 00:01:56.121839] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x290d240 00:19:55.545 [2024-05-15 00:01:56.121851] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.545 [2024-05-15 00:01:56.123500] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.545 [2024-05-15 00:01:56.123531] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:55.545 spare 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:55.803 [2024-05-15 00:01:56.366449] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:55.803 [2024-05-15 00:01:56.367823] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:55.803 [2024-05-15 00:01:56.367904] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x275df00 00:19:55.803 [2024-05-15 00:01:56.367915] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:55.803 [2024-05-15 00:01:56.368133] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275c510 00:19:55.803 [2024-05-15 00:01:56.368288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x275df00 00:19:55.803 [2024-05-15 00:01:56.368299] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x275df00 00:19:55.803 [2024-05-15 00:01:56.368434] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 
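The spare device above is layered as malloc, then delay, then passthru (-w 100000 -n 100000 should translate to roughly 100 ms of added write latency, presumably so that rebuild writes to the spare are slow enough to observe in flight), and the two base bdevs are then combined into a RAID1 array without a superblock. A condensed sketch of those RPCs, reusing the $RPC shorthand from the previous sketch, with a jq state check similar to what verify_raid_bdev_state does:

    # Sketch only: build the delayed spare and assemble the RAID1 bdev.
    $RPC bdev_malloc_create 32 512 -b spare_malloc
    $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $RPC bdev_passthru_create -b spare_delay -p spare
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'   # expect "online"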
00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.803 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.062 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:56.062 "name": "raid_bdev1", 00:19:56.062 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:19:56.062 "strip_size_kb": 0, 00:19:56.062 "state": "online", 00:19:56.062 "raid_level": "raid1", 00:19:56.062 "superblock": false, 00:19:56.062 "num_base_bdevs": 2, 00:19:56.062 "num_base_bdevs_discovered": 2, 00:19:56.062 "num_base_bdevs_operational": 2, 00:19:56.062 "base_bdevs_list": [ 00:19:56.062 { 00:19:56.062 "name": "BaseBdev1", 00:19:56.062 "uuid": "94b68246-1b13-5a31-8e70-15ee901c987f", 00:19:56.062 "is_configured": true, 00:19:56.062 "data_offset": 0, 00:19:56.062 "data_size": 65536 00:19:56.062 }, 00:19:56.062 { 00:19:56.062 "name": "BaseBdev2", 00:19:56.062 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:19:56.062 "is_configured": true, 00:19:56.062 "data_offset": 0, 00:19:56.062 "data_size": 65536 00:19:56.062 } 00:19:56.062 ] 00:19:56.062 }' 00:19:56.062 00:01:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:56.062 00:01:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.996 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:56.996 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:19:56.996 [2024-05-15 00:01:57.481591] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:56.996 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:19:56.996 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.996 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
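Here the harness reads the array geometry back (raid_bdev_size 65536 blocks, data_offset 0) and then, in the trace that follows, exposes raid_bdev1 through the kernel NBD driver so it can be pre-filled with dd. A rough sketch of that attach-fill-detach sequence, assuming the nbd kernel module is already loaded and reusing the $RPC shorthand introduced earlier:

    # Sketch only: export the raid bdev as /dev/nbd0, fill it, and detach it.
    $RPC nbd_start_disk raid_bdev1 /dev/nbd0
    while ! grep -q -w nbd0 /proc/partitions; do sleep 0.1; done     # wait for the kernel node, as waitfornbd does
    dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct  # 65536 x 512 B = 32 MiB, the whole array
    $RPC nbd_stop_disk /dev/nbd0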
00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:57.254 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:57.513 [2024-05-15 00:01:57.970832] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275c510 00:19:57.513 /dev/nbd0 00:19:57.513 00:01:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:57.513 1+0 records in 00:19:57.513 1+0 records out 00:19:57.513 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279323 s, 14.7 MB/s 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:19:57.513 00:01:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:20:04.062 65536+0 records in 00:20:04.062 65536+0 records out 00:20:04.062 33554432 bytes (34 MB, 32 MiB) copied, 5.37049 s, 6.2 MB/s 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:04.062 00:02:03 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:04.062 [2024-05-15 00:02:03.679291] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:04.062 [2024-05-15 00:02:03.915973] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:04.062 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.063 00:02:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.063 00:02:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:04.063 "name": "raid_bdev1", 00:20:04.063 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:04.063 "strip_size_kb": 0, 00:20:04.063 "state": "online", 00:20:04.063 "raid_level": "raid1", 00:20:04.063 "superblock": false, 00:20:04.063 "num_base_bdevs": 2, 00:20:04.063 "num_base_bdevs_discovered": 1, 00:20:04.063 "num_base_bdevs_operational": 1, 00:20:04.063 "base_bdevs_list": [ 00:20:04.063 { 00:20:04.063 "name": null, 00:20:04.063 "uuid": "00000000-0000-0000-0000-000000000000", 
00:20:04.063 "is_configured": false, 00:20:04.063 "data_offset": 0, 00:20:04.063 "data_size": 65536 00:20:04.063 }, 00:20:04.063 { 00:20:04.063 "name": "BaseBdev2", 00:20:04.063 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:04.063 "is_configured": true, 00:20:04.063 "data_offset": 0, 00:20:04.063 "data_size": 65536 00:20:04.063 } 00:20:04.063 ] 00:20:04.063 }' 00:20:04.063 00:02:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:04.063 00:02:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.319 00:02:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:04.577 [2024-05-15 00:02:04.990841] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:04.577 [2024-05-15 00:02:04.995777] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x275d570 00:20:04.578 [2024-05-15 00:02:04.997995] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:04.578 00:02:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # sleep 1 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.507 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.764 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:05.764 "name": "raid_bdev1", 00:20:05.764 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:05.764 "strip_size_kb": 0, 00:20:05.764 "state": "online", 00:20:05.764 "raid_level": "raid1", 00:20:05.764 "superblock": false, 00:20:05.764 "num_base_bdevs": 2, 00:20:05.764 "num_base_bdevs_discovered": 2, 00:20:05.764 "num_base_bdevs_operational": 2, 00:20:05.764 "process": { 00:20:05.764 "type": "rebuild", 00:20:05.764 "target": "spare", 00:20:05.764 "progress": { 00:20:05.764 "blocks": 22528, 00:20:05.764 "percent": 34 00:20:05.764 } 00:20:05.764 }, 00:20:05.765 "base_bdevs_list": [ 00:20:05.765 { 00:20:05.765 "name": "spare", 00:20:05.765 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:05.765 "is_configured": true, 00:20:05.765 "data_offset": 0, 00:20:05.765 "data_size": 65536 00:20:05.765 }, 00:20:05.765 { 00:20:05.765 "name": "BaseBdev2", 00:20:05.765 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:05.765 "is_configured": true, 00:20:05.765 "data_offset": 0, 00:20:05.765 "data_size": 65536 00:20:05.765 } 00:20:05.765 ] 00:20:05.765 }' 00:20:05.765 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:05.765 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:05.765 00:02:06 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:05.765 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:05.765 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:06.023 [2024-05-15 00:02:06.504157] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:06.023 [2024-05-15 00:02:06.509608] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:06.023 [2024-05-15 00:02:06.509651] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.023 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:06.281 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:06.281 "name": "raid_bdev1", 00:20:06.281 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:06.281 "strip_size_kb": 0, 00:20:06.281 "state": "online", 00:20:06.281 "raid_level": "raid1", 00:20:06.281 "superblock": false, 00:20:06.281 "num_base_bdevs": 2, 00:20:06.281 "num_base_bdevs_discovered": 1, 00:20:06.281 "num_base_bdevs_operational": 1, 00:20:06.281 "base_bdevs_list": [ 00:20:06.281 { 00:20:06.281 "name": null, 00:20:06.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.281 "is_configured": false, 00:20:06.281 "data_offset": 0, 00:20:06.281 "data_size": 65536 00:20:06.281 }, 00:20:06.281 { 00:20:06.281 "name": "BaseBdev2", 00:20:06.281 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:06.281 "is_configured": true, 00:20:06.281 "data_offset": 0, 00:20:06.281 "data_size": 65536 00:20:06.281 } 00:20:06.281 ] 00:20:06.281 }' 00:20:06.281 00:02:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:06.281 00:02:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:06.862 00:02:07 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.862 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:07.124 "name": "raid_bdev1", 00:20:07.124 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:07.124 "strip_size_kb": 0, 00:20:07.124 "state": "online", 00:20:07.124 "raid_level": "raid1", 00:20:07.124 "superblock": false, 00:20:07.124 "num_base_bdevs": 2, 00:20:07.124 "num_base_bdevs_discovered": 1, 00:20:07.124 "num_base_bdevs_operational": 1, 00:20:07.124 "base_bdevs_list": [ 00:20:07.124 { 00:20:07.124 "name": null, 00:20:07.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.124 "is_configured": false, 00:20:07.124 "data_offset": 0, 00:20:07.124 "data_size": 65536 00:20:07.124 }, 00:20:07.124 { 00:20:07.124 "name": "BaseBdev2", 00:20:07.124 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:07.124 "is_configured": true, 00:20:07.124 "data_offset": 0, 00:20:07.124 "data_size": 65536 00:20:07.124 } 00:20:07.124 ] 00:20:07.124 }' 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:07.124 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:07.382 [2024-05-15 00:02:07.930569] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:07.382 [2024-05-15 00:02:07.936168] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2916640 00:20:07.382 [2024-05-15 00:02:07.937681] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:07.382 00:02:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # sleep 1 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.397 00:02:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.654 
00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:08.654 "name": "raid_bdev1", 00:20:08.654 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:08.654 "strip_size_kb": 0, 00:20:08.654 "state": "online", 00:20:08.654 "raid_level": "raid1", 00:20:08.654 "superblock": false, 00:20:08.654 "num_base_bdevs": 2, 00:20:08.654 "num_base_bdevs_discovered": 2, 00:20:08.654 "num_base_bdevs_operational": 2, 00:20:08.654 "process": { 00:20:08.654 "type": "rebuild", 00:20:08.654 "target": "spare", 00:20:08.654 "progress": { 00:20:08.654 "blocks": 24576, 00:20:08.654 "percent": 37 00:20:08.654 } 00:20:08.654 }, 00:20:08.654 "base_bdevs_list": [ 00:20:08.654 { 00:20:08.654 "name": "spare", 00:20:08.654 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:08.654 "is_configured": true, 00:20:08.654 "data_offset": 0, 00:20:08.654 "data_size": 65536 00:20:08.654 }, 00:20:08.654 { 00:20:08.654 "name": "BaseBdev2", 00:20:08.654 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:08.654 "is_configured": true, 00:20:08.654 "data_offset": 0, 00:20:08.654 "data_size": 65536 00:20:08.654 } 00:20:08.654 ] 00:20:08.654 }' 00:20:08.654 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:08.654 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # local timeout=620 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.911 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:09.168 "name": "raid_bdev1", 00:20:09.168 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:09.168 "strip_size_kb": 0, 00:20:09.168 "state": "online", 00:20:09.168 "raid_level": "raid1", 00:20:09.168 "superblock": false, 00:20:09.168 "num_base_bdevs": 2, 00:20:09.168 "num_base_bdevs_discovered": 2, 00:20:09.168 "num_base_bdevs_operational": 2, 00:20:09.168 
"process": { 00:20:09.168 "type": "rebuild", 00:20:09.168 "target": "spare", 00:20:09.168 "progress": { 00:20:09.168 "blocks": 30720, 00:20:09.168 "percent": 46 00:20:09.168 } 00:20:09.168 }, 00:20:09.168 "base_bdevs_list": [ 00:20:09.168 { 00:20:09.168 "name": "spare", 00:20:09.168 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:09.168 "is_configured": true, 00:20:09.168 "data_offset": 0, 00:20:09.168 "data_size": 65536 00:20:09.168 }, 00:20:09.168 { 00:20:09.168 "name": "BaseBdev2", 00:20:09.168 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:09.168 "is_configured": true, 00:20:09.168 "data_offset": 0, 00:20:09.168 "data_size": 65536 00:20:09.168 } 00:20:09.168 ] 00:20:09.168 }' 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:09.168 00:02:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.101 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.359 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:10.359 "name": "raid_bdev1", 00:20:10.359 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:10.359 "strip_size_kb": 0, 00:20:10.359 "state": "online", 00:20:10.359 "raid_level": "raid1", 00:20:10.359 "superblock": false, 00:20:10.359 "num_base_bdevs": 2, 00:20:10.359 "num_base_bdevs_discovered": 2, 00:20:10.359 "num_base_bdevs_operational": 2, 00:20:10.359 "process": { 00:20:10.359 "type": "rebuild", 00:20:10.359 "target": "spare", 00:20:10.359 "progress": { 00:20:10.359 "blocks": 57344, 00:20:10.359 "percent": 87 00:20:10.359 } 00:20:10.359 }, 00:20:10.359 "base_bdevs_list": [ 00:20:10.359 { 00:20:10.359 "name": "spare", 00:20:10.359 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:10.359 "is_configured": true, 00:20:10.359 "data_offset": 0, 00:20:10.359 "data_size": 65536 00:20:10.359 }, 00:20:10.359 { 00:20:10.359 "name": "BaseBdev2", 00:20:10.359 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:10.359 "is_configured": true, 00:20:10.359 "data_offset": 0, 00:20:10.359 "data_size": 65536 00:20:10.359 } 00:20:10.359 ] 00:20:10.359 }' 00:20:10.359 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:10.359 00:02:10 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:10.359 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:10.617 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:10.617 00:02:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:10.617 [2024-05-15 00:02:11.163190] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:10.617 [2024-05-15 00:02:11.163250] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:10.617 [2024-05-15 00:02:11.163288] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.555 00:02:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.812 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:11.812 "name": "raid_bdev1", 00:20:11.812 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:11.812 "strip_size_kb": 0, 00:20:11.812 "state": "online", 00:20:11.812 "raid_level": "raid1", 00:20:11.812 "superblock": false, 00:20:11.812 "num_base_bdevs": 2, 00:20:11.812 "num_base_bdevs_discovered": 2, 00:20:11.812 "num_base_bdevs_operational": 2, 00:20:11.812 "base_bdevs_list": [ 00:20:11.812 { 00:20:11.812 "name": "spare", 00:20:11.812 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:11.812 "is_configured": true, 00:20:11.812 "data_offset": 0, 00:20:11.812 "data_size": 65536 00:20:11.812 }, 00:20:11.812 { 00:20:11.812 "name": "BaseBdev2", 00:20:11.812 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:11.812 "is_configured": true, 00:20:11.812 "data_offset": 0, 00:20:11.812 "data_size": 65536 00:20:11.812 } 00:20:11.812 ] 00:20:11.812 }' 00:20:11.812 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:11.812 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:11.812 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:11.812 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # break 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 
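(Annotation: the rebuild iterations above, with progress climbing 22528 -> 24576 -> 30720 -> 57344 blocks and ending in the "Finished rebuild" notice, come from a simple poll-and-sleep loop in the test script. A sketch of the same pattern under the same assumptions as the earlier sketches; rpc_py is our shorthand.)

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Poll raid_bdev1 until no rebuild process is reported, printing progress in blocks.
    while true; do
      info=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
      [ "$(echo "$info" | jq -r '.process.type // "none"')" = "none" ] && break
      echo "$info" | jq -r '.process.progress.blocks'
      sleep 1
    done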
00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.813 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:12.070 "name": "raid_bdev1", 00:20:12.070 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:12.070 "strip_size_kb": 0, 00:20:12.070 "state": "online", 00:20:12.070 "raid_level": "raid1", 00:20:12.070 "superblock": false, 00:20:12.070 "num_base_bdevs": 2, 00:20:12.070 "num_base_bdevs_discovered": 2, 00:20:12.070 "num_base_bdevs_operational": 2, 00:20:12.070 "base_bdevs_list": [ 00:20:12.070 { 00:20:12.070 "name": "spare", 00:20:12.070 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:12.070 "is_configured": true, 00:20:12.070 "data_offset": 0, 00:20:12.070 "data_size": 65536 00:20:12.070 }, 00:20:12.070 { 00:20:12.070 "name": "BaseBdev2", 00:20:12.070 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:12.070 "is_configured": true, 00:20:12.070 "data_offset": 0, 00:20:12.070 "data_size": 65536 00:20:12.070 } 00:20:12.070 ] 00:20:12.070 }' 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.070 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.328 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:12.328 "name": "raid_bdev1", 00:20:12.328 "uuid": "c46d16e1-dd75-47cb-9713-ab94159838a9", 00:20:12.328 "strip_size_kb": 0, 00:20:12.328 "state": 
"online", 00:20:12.328 "raid_level": "raid1", 00:20:12.328 "superblock": false, 00:20:12.328 "num_base_bdevs": 2, 00:20:12.328 "num_base_bdevs_discovered": 2, 00:20:12.328 "num_base_bdevs_operational": 2, 00:20:12.328 "base_bdevs_list": [ 00:20:12.328 { 00:20:12.328 "name": "spare", 00:20:12.328 "uuid": "bb33c19c-ca2b-5c6c-aaf4-0b6e2d4a8820", 00:20:12.328 "is_configured": true, 00:20:12.328 "data_offset": 0, 00:20:12.328 "data_size": 65536 00:20:12.328 }, 00:20:12.328 { 00:20:12.328 "name": "BaseBdev2", 00:20:12.328 "uuid": "4b99068f-f7cb-5837-869b-44aa7a9322ba", 00:20:12.328 "is_configured": true, 00:20:12.328 "data_offset": 0, 00:20:12.328 "data_size": 65536 00:20:12.328 } 00:20:12.328 ] 00:20:12.328 }' 00:20:12.328 00:02:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:12.328 00:02:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:12.891 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:13.148 [2024-05-15 00:02:13.658977] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:13.148 [2024-05-15 00:02:13.659006] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.148 [2024-05-15 00:02:13.659069] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.148 [2024-05-15 00:02:13.659126] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:13.148 [2024-05-15 00:02:13.659139] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x275df00 name raid_bdev1, state offline 00:20:13.148 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.148 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # jq length 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:13.405 00:02:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:13.662 /dev/nbd0 
00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:13.662 1+0 records in 00:20:13.662 1+0 records out 00:20:13.662 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269134 s, 15.2 MB/s 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:13.662 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:13.920 /dev/nbd1 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:13.920 1+0 records in 00:20:13.920 1+0 records out 00:20:13.920 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000498216 s, 8.2 MB/s 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:13.920 00:02:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:14.177 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:14.434 00:02:14 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@795 -- # killprocess 472414 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@946 -- # '[' -z 472414 ']' 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 472414 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # uname 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 472414 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 472414' 00:20:14.692 killing process with pid 472414 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 472414 00:20:14.692 Received shutdown signal, test time was about 60.000000 seconds 00:20:14.692 00:20:14.692 Latency(us) 00:20:14.692 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:14.692 =================================================================================================================== 00:20:14.692 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:14.692 [2024-05-15 00:02:15.173353] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:14.692 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 472414 00:20:14.692 [2024-05-15 00:02:15.201605] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:14.949 00:02:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@797 -- # return 0 00:20:14.949 00:20:14.949 real 0m21.933s 00:20:14.949 user 0m29.212s 00:20:14.949 sys 0m5.119s 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.950 ************************************ 00:20:14.950 END TEST raid_rebuild_test 00:20:14.950 ************************************ 00:20:14.950 00:02:15 bdev_raid -- bdev/bdev_raid.sh@824 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:20:14.950 00:02:15 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:14.950 00:02:15 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:14.950 00:02:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:14.950 ************************************ 00:20:14.950 START TEST raid_rebuild_test_sb 00:20:14.950 ************************************ 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:20:14.950 
00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local verify=true 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:20:14.950 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # local strip_size 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@582 -- # local create_arg 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local data_offset 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # raid_pid=475471 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # waitforlisten 475471 /var/tmp/spdk-raid.sock 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 475471 ']' 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:15.208 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:15.208 00:02:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:15.208 [2024-05-15 00:02:15.600786] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:20:15.208 [2024-05-15 00:02:15.600847] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475471 ] 00:20:15.208 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:15.208 Zero copy mechanism will not be used. 00:20:15.208 [2024-05-15 00:02:15.728154] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:15.466 [2024-05-15 00:02:15.833861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:15.466 [2024-05-15 00:02:15.900674] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:15.466 [2024-05-15 00:02:15.900718] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:16.032 00:02:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:16.032 00:02:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:20:16.032 00:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:16.032 00:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:16.290 BaseBdev1_malloc 00:20:16.290 00:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:16.548 [2024-05-15 00:02:16.998286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:16.548 [2024-05-15 00:02:16.998334] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.548 [2024-05-15 00:02:16.998357] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd06b50 00:20:16.548 [2024-05-15 00:02:16.998370] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.548 [2024-05-15 00:02:17.000237] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.548 [2024-05-15 00:02:17.000269] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:16.548 BaseBdev1 00:20:16.548 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:16.548 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:16.806 BaseBdev2_malloc 00:20:16.806 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:17.064 [2024-05-15 00:02:17.488757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:17.064 [2024-05-15 00:02:17.488805] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:20:17.064 [2024-05-15 00:02:17.488826] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeacd10 00:20:17.064 [2024-05-15 00:02:17.488839] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.064 [2024-05-15 00:02:17.490445] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.064 [2024-05-15 00:02:17.490476] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:17.064 BaseBdev2 00:20:17.064 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:17.322 spare_malloc 00:20:17.322 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:17.580 spare_delay 00:20:17.580 00:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:17.838 [2024-05-15 00:02:18.215278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:17.838 [2024-05-15 00:02:18.215326] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:17.838 [2024-05-15 00:02:18.215347] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeaf240 00:20:17.838 [2024-05-15 00:02:18.215360] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.838 [2024-05-15 00:02:18.217007] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.838 [2024-05-15 00:02:18.217037] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:17.838 spare 00:20:17.838 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:18.096 [2024-05-15 00:02:18.451932] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:18.096 [2024-05-15 00:02:18.453290] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:18.096 [2024-05-15 00:02:18.453473] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xcfff00 00:20:18.096 [2024-05-15 00:02:18.453487] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:18.096 [2024-05-15 00:02:18.453696] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcfe510 00:20:18.096 [2024-05-15 00:02:18.453844] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcfff00 00:20:18.096 [2024-05-15 00:02:18.453854] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcfff00 00:20:18.096 [2024-05-15 00:02:18.453956] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.096 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.354 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:18.354 "name": "raid_bdev1", 00:20:18.354 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:18.354 "strip_size_kb": 0, 00:20:18.354 "state": "online", 00:20:18.354 "raid_level": "raid1", 00:20:18.354 "superblock": true, 00:20:18.354 "num_base_bdevs": 2, 00:20:18.354 "num_base_bdevs_discovered": 2, 00:20:18.354 "num_base_bdevs_operational": 2, 00:20:18.354 "base_bdevs_list": [ 00:20:18.354 { 00:20:18.354 "name": "BaseBdev1", 00:20:18.354 "uuid": "ced0f674-9853-53fa-a4e5-a916d9b6433c", 00:20:18.354 "is_configured": true, 00:20:18.354 "data_offset": 2048, 00:20:18.354 "data_size": 63488 00:20:18.354 }, 00:20:18.354 { 00:20:18.354 "name": "BaseBdev2", 00:20:18.354 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:18.354 "is_configured": true, 00:20:18.354 "data_offset": 2048, 00:20:18.354 "data_size": 63488 00:20:18.354 } 00:20:18.354 ] 00:20:18.354 }' 00:20:18.354 00:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:18.354 00:02:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.938 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:18.938 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:20:18.938 [2024-05-15 00:02:19.526982] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:19.196 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:20:19.196 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.196 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@630 -- # local 
write_unit_size 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.453 00:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:19.453 [2024-05-15 00:02:20.020098] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcfe510 00:20:19.453 /dev/nbd0 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:19.710 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:19.710 1+0 records in 00:20:19.710 1+0 records out 00:20:19.710 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296771 s, 13.8 MB/s 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.711 
00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:20:19.711 00:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:20:26.289 63488+0 records in 00:20:26.289 63488+0 records out 00:20:26.289 32505856 bytes (33 MB, 31 MiB) copied, 5.86923 s, 5.5 MB/s 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:26.289 00:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:26.289 [2024-05-15 00:02:26.227374] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:26.289 [2024-05-15 00:02:26.460040] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local 
num_base_bdevs_discovered 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.289 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:26.289 "name": "raid_bdev1", 00:20:26.289 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:26.289 "strip_size_kb": 0, 00:20:26.289 "state": "online", 00:20:26.289 "raid_level": "raid1", 00:20:26.289 "superblock": true, 00:20:26.289 "num_base_bdevs": 2, 00:20:26.289 "num_base_bdevs_discovered": 1, 00:20:26.289 "num_base_bdevs_operational": 1, 00:20:26.289 "base_bdevs_list": [ 00:20:26.289 { 00:20:26.289 "name": null, 00:20:26.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.289 "is_configured": false, 00:20:26.289 "data_offset": 2048, 00:20:26.289 "data_size": 63488 00:20:26.289 }, 00:20:26.289 { 00:20:26.289 "name": "BaseBdev2", 00:20:26.289 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:26.289 "is_configured": true, 00:20:26.289 "data_offset": 2048, 00:20:26.289 "data_size": 63488 00:20:26.289 } 00:20:26.289 ] 00:20:26.290 }' 00:20:26.290 00:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:26.290 00:02:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:26.855 00:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:27.114 [2024-05-15 00:02:27.558964] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:27.114 [2024-05-15 00:02:27.563917] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd02750 00:20:27.114 [2024-05-15 00:02:27.566178] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:27.114 00:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # sleep 1 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.046 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.303 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:28.303 "name": "raid_bdev1", 00:20:28.303 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:28.303 "strip_size_kb": 0, 00:20:28.303 "state": "online", 00:20:28.303 "raid_level": "raid1", 00:20:28.303 "superblock": true, 00:20:28.303 "num_base_bdevs": 2, 
00:20:28.303 "num_base_bdevs_discovered": 2, 00:20:28.303 "num_base_bdevs_operational": 2, 00:20:28.303 "process": { 00:20:28.303 "type": "rebuild", 00:20:28.303 "target": "spare", 00:20:28.304 "progress": { 00:20:28.304 "blocks": 24576, 00:20:28.304 "percent": 38 00:20:28.304 } 00:20:28.304 }, 00:20:28.304 "base_bdevs_list": [ 00:20:28.304 { 00:20:28.304 "name": "spare", 00:20:28.304 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:28.304 "is_configured": true, 00:20:28.304 "data_offset": 2048, 00:20:28.304 "data_size": 63488 00:20:28.304 }, 00:20:28.304 { 00:20:28.304 "name": "BaseBdev2", 00:20:28.304 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:28.304 "is_configured": true, 00:20:28.304 "data_offset": 2048, 00:20:28.304 "data_size": 63488 00:20:28.304 } 00:20:28.304 ] 00:20:28.304 }' 00:20:28.304 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:28.304 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:28.304 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:28.561 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:28.561 00:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:28.561 [2024-05-15 00:02:29.131952] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:28.819 [2024-05-15 00:02:29.178992] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:28.819 [2024-05-15 00:02:29.179039] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.819 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.078 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:29.078 "name": "raid_bdev1", 00:20:29.078 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:29.078 "strip_size_kb": 0, 00:20:29.078 "state": "online", 00:20:29.078 
"raid_level": "raid1", 00:20:29.078 "superblock": true, 00:20:29.078 "num_base_bdevs": 2, 00:20:29.078 "num_base_bdevs_discovered": 1, 00:20:29.078 "num_base_bdevs_operational": 1, 00:20:29.078 "base_bdevs_list": [ 00:20:29.078 { 00:20:29.078 "name": null, 00:20:29.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.078 "is_configured": false, 00:20:29.078 "data_offset": 2048, 00:20:29.078 "data_size": 63488 00:20:29.078 }, 00:20:29.078 { 00:20:29.078 "name": "BaseBdev2", 00:20:29.078 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:29.078 "is_configured": true, 00:20:29.078 "data_offset": 2048, 00:20:29.078 "data_size": 63488 00:20:29.078 } 00:20:29.078 ] 00:20:29.078 }' 00:20:29.078 00:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:29.078 00:02:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.643 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:29.901 "name": "raid_bdev1", 00:20:29.901 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:29.901 "strip_size_kb": 0, 00:20:29.901 "state": "online", 00:20:29.901 "raid_level": "raid1", 00:20:29.901 "superblock": true, 00:20:29.901 "num_base_bdevs": 2, 00:20:29.901 "num_base_bdevs_discovered": 1, 00:20:29.901 "num_base_bdevs_operational": 1, 00:20:29.901 "base_bdevs_list": [ 00:20:29.901 { 00:20:29.901 "name": null, 00:20:29.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.901 "is_configured": false, 00:20:29.901 "data_offset": 2048, 00:20:29.901 "data_size": 63488 00:20:29.901 }, 00:20:29.901 { 00:20:29.901 "name": "BaseBdev2", 00:20:29.901 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:29.901 "is_configured": true, 00:20:29.901 "data_offset": 2048, 00:20:29.901 "data_size": 63488 00:20:29.901 } 00:20:29.901 ] 00:20:29.901 }' 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:29.901 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:30.158 [2024-05-15 00:02:30.555759] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:30.158 [2024-05-15 
00:02:30.560699] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa0ecd0 00:20:30.159 [2024-05-15 00:02:30.562200] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:30.159 00:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # sleep 1 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.092 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:31.350 "name": "raid_bdev1", 00:20:31.350 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:31.350 "strip_size_kb": 0, 00:20:31.350 "state": "online", 00:20:31.350 "raid_level": "raid1", 00:20:31.350 "superblock": true, 00:20:31.350 "num_base_bdevs": 2, 00:20:31.350 "num_base_bdevs_discovered": 2, 00:20:31.350 "num_base_bdevs_operational": 2, 00:20:31.350 "process": { 00:20:31.350 "type": "rebuild", 00:20:31.350 "target": "spare", 00:20:31.350 "progress": { 00:20:31.350 "blocks": 24576, 00:20:31.350 "percent": 38 00:20:31.350 } 00:20:31.350 }, 00:20:31.350 "base_bdevs_list": [ 00:20:31.350 { 00:20:31.350 "name": "spare", 00:20:31.350 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:31.350 "is_configured": true, 00:20:31.350 "data_offset": 2048, 00:20:31.350 "data_size": 63488 00:20:31.350 }, 00:20:31.350 { 00:20:31.350 "name": "BaseBdev2", 00:20:31.350 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:31.350 "is_configured": true, 00:20:31.350 "data_offset": 2048, 00:20:31.350 "data_size": 63488 00:20:31.350 } 00:20:31.350 ] 00:20:31.350 }' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:20:31.350 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@711 -- # local timeout=642 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.350 00:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.608 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:31.608 "name": "raid_bdev1", 00:20:31.608 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:31.608 "strip_size_kb": 0, 00:20:31.608 "state": "online", 00:20:31.608 "raid_level": "raid1", 00:20:31.608 "superblock": true, 00:20:31.608 "num_base_bdevs": 2, 00:20:31.608 "num_base_bdevs_discovered": 2, 00:20:31.608 "num_base_bdevs_operational": 2, 00:20:31.608 "process": { 00:20:31.608 "type": "rebuild", 00:20:31.608 "target": "spare", 00:20:31.608 "progress": { 00:20:31.608 "blocks": 30720, 00:20:31.608 "percent": 48 00:20:31.608 } 00:20:31.608 }, 00:20:31.608 "base_bdevs_list": [ 00:20:31.608 { 00:20:31.608 "name": "spare", 00:20:31.608 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:31.608 "is_configured": true, 00:20:31.608 "data_offset": 2048, 00:20:31.608 "data_size": 63488 00:20:31.608 }, 00:20:31.608 { 00:20:31.608 "name": "BaseBdev2", 00:20:31.608 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:31.608 "is_configured": true, 00:20:31.608 "data_offset": 2048, 00:20:31.608 "data_size": 63488 00:20:31.608 } 00:20:31.608 ] 00:20:31.608 }' 00:20:31.608 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:31.608 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:31.608 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:31.865 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:31.865 00:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.799 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.063 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:33.063 "name": "raid_bdev1", 00:20:33.063 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:33.063 "strip_size_kb": 0, 00:20:33.063 "state": "online", 00:20:33.063 "raid_level": "raid1", 00:20:33.063 "superblock": true, 00:20:33.063 "num_base_bdevs": 2, 00:20:33.063 "num_base_bdevs_discovered": 2, 00:20:33.063 "num_base_bdevs_operational": 2, 00:20:33.063 "process": { 00:20:33.063 "type": "rebuild", 00:20:33.063 "target": "spare", 00:20:33.063 "progress": { 00:20:33.063 "blocks": 57344, 00:20:33.063 "percent": 90 00:20:33.063 } 00:20:33.063 }, 00:20:33.063 "base_bdevs_list": [ 00:20:33.063 { 00:20:33.063 "name": "spare", 00:20:33.063 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:33.063 "is_configured": true, 00:20:33.063 "data_offset": 2048, 00:20:33.063 "data_size": 63488 00:20:33.063 }, 00:20:33.063 { 00:20:33.063 "name": "BaseBdev2", 00:20:33.063 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:33.063 "is_configured": true, 00:20:33.063 "data_offset": 2048, 00:20:33.063 "data_size": 63488 00:20:33.063 } 00:20:33.063 ] 00:20:33.063 }' 00:20:33.063 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:33.063 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:33.063 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:33.064 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:33.064 00:02:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:33.320 [2024-05-15 00:02:33.686528] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:33.320 [2024-05-15 00:02:33.686593] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:33.320 [2024-05-15 00:02:33.686678] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.251 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:34.251 "name": "raid_bdev1", 00:20:34.251 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:34.251 
"strip_size_kb": 0, 00:20:34.251 "state": "online", 00:20:34.251 "raid_level": "raid1", 00:20:34.251 "superblock": true, 00:20:34.251 "num_base_bdevs": 2, 00:20:34.252 "num_base_bdevs_discovered": 2, 00:20:34.252 "num_base_bdevs_operational": 2, 00:20:34.252 "base_bdevs_list": [ 00:20:34.252 { 00:20:34.252 "name": "spare", 00:20:34.252 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:34.252 "is_configured": true, 00:20:34.252 "data_offset": 2048, 00:20:34.252 "data_size": 63488 00:20:34.252 }, 00:20:34.252 { 00:20:34.252 "name": "BaseBdev2", 00:20:34.252 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:34.252 "is_configured": true, 00:20:34.252 "data_offset": 2048, 00:20:34.252 "data_size": 63488 00:20:34.252 } 00:20:34.252 ] 00:20:34.252 }' 00:20:34.252 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:34.252 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:34.252 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # break 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.509 00:02:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.509 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:34.509 "name": "raid_bdev1", 00:20:34.509 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:34.509 "strip_size_kb": 0, 00:20:34.509 "state": "online", 00:20:34.509 "raid_level": "raid1", 00:20:34.509 "superblock": true, 00:20:34.509 "num_base_bdevs": 2, 00:20:34.509 "num_base_bdevs_discovered": 2, 00:20:34.509 "num_base_bdevs_operational": 2, 00:20:34.509 "base_bdevs_list": [ 00:20:34.509 { 00:20:34.509 "name": "spare", 00:20:34.509 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:34.509 "is_configured": true, 00:20:34.509 "data_offset": 2048, 00:20:34.509 "data_size": 63488 00:20:34.509 }, 00:20:34.509 { 00:20:34.509 "name": "BaseBdev2", 00:20:34.509 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:34.509 "is_configured": true, 00:20:34.509 "data_offset": 2048, 00:20:34.509 "data_size": 63488 00:20:34.509 } 00:20:34.509 ] 00:20:34.509 }' 00:20:34.509 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.767 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.025 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:35.025 "name": "raid_bdev1", 00:20:35.025 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:35.025 "strip_size_kb": 0, 00:20:35.025 "state": "online", 00:20:35.025 "raid_level": "raid1", 00:20:35.025 "superblock": true, 00:20:35.025 "num_base_bdevs": 2, 00:20:35.025 "num_base_bdevs_discovered": 2, 00:20:35.025 "num_base_bdevs_operational": 2, 00:20:35.025 "base_bdevs_list": [ 00:20:35.025 { 00:20:35.025 "name": "spare", 00:20:35.025 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:35.025 "is_configured": true, 00:20:35.025 "data_offset": 2048, 00:20:35.025 "data_size": 63488 00:20:35.025 }, 00:20:35.025 { 00:20:35.025 "name": "BaseBdev2", 00:20:35.025 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:35.025 "is_configured": true, 00:20:35.025 "data_offset": 2048, 00:20:35.025 "data_size": 63488 00:20:35.025 } 00:20:35.025 ] 00:20:35.025 }' 00:20:35.025 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:35.025 00:02:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.591 00:02:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:35.849 [2024-05-15 00:02:36.202543] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:35.849 [2024-05-15 00:02:36.202574] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:35.849 [2024-05-15 00:02:36.202638] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:35.849 [2024-05-15 00:02:36.202693] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:35.849 [2024-05-15 00:02:36.202705] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcfff00 name raid_bdev1, state offline 00:20:35.849 00:02:36 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.849 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # jq length 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:36.106 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:36.363 /dev/nbd0 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:36.363 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:36.364 1+0 records in 00:20:36.364 1+0 records out 00:20:36.364 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277594 s, 14.8 MB/s 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:36.364 00:02:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:36.621 /dev/nbd1 00:20:36.621 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:36.621 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:36.621 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:36.621 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:36.621 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:36.622 1+0 records in 00:20:36.622 1+0 records out 00:20:36.622 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000369277 s, 11.1 MB/s 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:36.622 00:02:37 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:36.622 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:36.879 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:36.879 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:36.879 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:36.879 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:36.879 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:36.880 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:36.880 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:36.880 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:36.880 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:36.880 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:20:37.138 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:37.396 00:02:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:37.653 [2024-05-15 00:02:38.161467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:37.653 [2024-05-15 00:02:38.161512] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.653 [2024-05-15 00:02:38.161533] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd06d80 00:20:37.653 [2024-05-15 00:02:38.161546] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.653 [2024-05-15 
00:02:38.163173] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.653 [2024-05-15 00:02:38.163202] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:37.653 [2024-05-15 00:02:38.163270] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:37.653 [2024-05-15 00:02:38.163296] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:37.653 BaseBdev1 00:20:37.653 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:20:37.653 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:20:37.653 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:20:37.911 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:38.168 [2024-05-15 00:02:38.650768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:38.168 [2024-05-15 00:02:38.650817] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.168 [2024-05-15 00:02:38.650837] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd06220 00:20:38.168 [2024-05-15 00:02:38.650850] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.168 [2024-05-15 00:02:38.651214] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.168 [2024-05-15 00:02:38.651233] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:38.168 [2024-05-15 00:02:38.651298] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:20:38.168 [2024-05-15 00:02:38.651310] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:20:38.168 [2024-05-15 00:02:38.651320] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:38.168 [2024-05-15 00:02:38.651336] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeb8260 name raid_bdev1, state configuring 00:20:38.168 [2024-05-15 00:02:38.651366] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:38.168 BaseBdev2 00:20:38.168 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:38.425 00:02:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:38.683 [2024-05-15 00:02:39.140062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:38.683 [2024-05-15 00:02:39.140106] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.683 [2024-05-15 00:02:39.140127] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeae510 00:20:38.683 [2024-05-15 00:02:39.140139] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.683 [2024-05-15 
00:02:39.140518] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.683 [2024-05-15 00:02:39.140537] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:38.683 [2024-05-15 00:02:39.140617] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:38.683 [2024-05-15 00:02:39.140635] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.683 spare 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.683 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.683 [2024-05-15 00:02:39.240962] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xcfde40 00:20:38.683 [2024-05-15 00:02:39.240979] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:38.683 [2024-05-15 00:02:39.241187] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd01250 00:20:38.683 [2024-05-15 00:02:39.241346] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcfde40 00:20:38.683 [2024-05-15 00:02:39.241356] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcfde40 00:20:38.683 [2024-05-15 00:02:39.241477] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:38.941 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:38.941 "name": "raid_bdev1", 00:20:38.941 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:38.941 "strip_size_kb": 0, 00:20:38.941 "state": "online", 00:20:38.941 "raid_level": "raid1", 00:20:38.941 "superblock": true, 00:20:38.941 "num_base_bdevs": 2, 00:20:38.941 "num_base_bdevs_discovered": 2, 00:20:38.941 "num_base_bdevs_operational": 2, 00:20:38.941 "base_bdevs_list": [ 00:20:38.941 { 00:20:38.941 "name": "spare", 00:20:38.941 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:38.941 "is_configured": true, 00:20:38.941 "data_offset": 2048, 00:20:38.941 "data_size": 63488 00:20:38.941 }, 00:20:38.941 { 00:20:38.941 "name": "BaseBdev2", 00:20:38.941 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:38.941 "is_configured": true, 00:20:38.941 
"data_offset": 2048, 00:20:38.941 "data_size": 63488 00:20:38.941 } 00:20:38.941 ] 00:20:38.941 }' 00:20:38.941 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:38.941 00:02:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.507 00:02:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:39.770 "name": "raid_bdev1", 00:20:39.770 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:39.770 "strip_size_kb": 0, 00:20:39.770 "state": "online", 00:20:39.770 "raid_level": "raid1", 00:20:39.770 "superblock": true, 00:20:39.770 "num_base_bdevs": 2, 00:20:39.770 "num_base_bdevs_discovered": 2, 00:20:39.770 "num_base_bdevs_operational": 2, 00:20:39.770 "base_bdevs_list": [ 00:20:39.770 { 00:20:39.770 "name": "spare", 00:20:39.770 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:39.770 "is_configured": true, 00:20:39.770 "data_offset": 2048, 00:20:39.770 "data_size": 63488 00:20:39.770 }, 00:20:39.770 { 00:20:39.770 "name": "BaseBdev2", 00:20:39.770 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:39.770 "is_configured": true, 00:20:39.770 "data_offset": 2048, 00:20:39.770 "data_size": 63488 00:20:39.770 } 00:20:39.770 ] 00:20:39.770 }' 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.770 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:40.028 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:20:40.028 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:40.316 [2024-05-15 00:02:40.716363] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=raid_bdev1 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.316 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.573 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:40.573 "name": "raid_bdev1", 00:20:40.573 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:40.573 "strip_size_kb": 0, 00:20:40.573 "state": "online", 00:20:40.573 "raid_level": "raid1", 00:20:40.573 "superblock": true, 00:20:40.573 "num_base_bdevs": 2, 00:20:40.573 "num_base_bdevs_discovered": 1, 00:20:40.573 "num_base_bdevs_operational": 1, 00:20:40.573 "base_bdevs_list": [ 00:20:40.573 { 00:20:40.573 "name": null, 00:20:40.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.573 "is_configured": false, 00:20:40.573 "data_offset": 2048, 00:20:40.573 "data_size": 63488 00:20:40.573 }, 00:20:40.573 { 00:20:40.573 "name": "BaseBdev2", 00:20:40.573 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:40.573 "is_configured": true, 00:20:40.573 "data_offset": 2048, 00:20:40.573 "data_size": 63488 00:20:40.573 } 00:20:40.573 ] 00:20:40.573 }' 00:20:40.573 00:02:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:40.573 00:02:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.140 00:02:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:41.397 [2024-05-15 00:02:41.743099] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.398 [2024-05-15 00:02:41.743251] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:41.398 [2024-05-15 00:02:41.743268] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
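The rebuild that starts here is monitored the same way as the earlier ones: the test polls bdev_raid_get_bdevs over the RPC socket and pulls the process fields out with jq (the bdev_raid.sh@188-191 steps seen throughout this trace). A minimal sketch of that polling pattern, using the rpc.py path and socket from this run; the variable names and standalone echo calls are illustrative, not the test script's own code:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Select the raid_bdev1 entry out of the full bdev list, as bdev_raid.sh@188 does above.
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  echo "$info" | jq -r '.process.type // "none"'       # "rebuild" while a rebuild is in flight, otherwise "none"
  echo "$info" | jq -r '.process.target // "none"'     # "spare" means data is being rebuilt onto the spare bdev
  echo "$info" | jq -r '.process.progress.percent'     # the 38, 48, 90 ... snapshots seen in the trace above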
00:20:41.398 [2024-05-15 00:02:41.743296] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.398 [2024-05-15 00:02:41.748084] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd00580 00:20:41.398 [2024-05-15 00:02:41.749474] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:41.398 00:02:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # sleep 1 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.333 00:02:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:42.593 "name": "raid_bdev1", 00:20:42.593 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:42.593 "strip_size_kb": 0, 00:20:42.593 "state": "online", 00:20:42.593 "raid_level": "raid1", 00:20:42.593 "superblock": true, 00:20:42.593 "num_base_bdevs": 2, 00:20:42.593 "num_base_bdevs_discovered": 2, 00:20:42.593 "num_base_bdevs_operational": 2, 00:20:42.593 "process": { 00:20:42.593 "type": "rebuild", 00:20:42.593 "target": "spare", 00:20:42.593 "progress": { 00:20:42.593 "blocks": 24576, 00:20:42.593 "percent": 38 00:20:42.593 } 00:20:42.593 }, 00:20:42.593 "base_bdevs_list": [ 00:20:42.593 { 00:20:42.593 "name": "spare", 00:20:42.593 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:42.593 "is_configured": true, 00:20:42.593 "data_offset": 2048, 00:20:42.593 "data_size": 63488 00:20:42.593 }, 00:20:42.593 { 00:20:42.593 "name": "BaseBdev2", 00:20:42.593 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:42.593 "is_configured": true, 00:20:42.593 "data_offset": 2048, 00:20:42.593 "data_size": 63488 00:20:42.593 } 00:20:42.593 ] 00:20:42.593 }' 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.593 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:42.851 [2024-05-15 00:02:43.280643] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:42.851 [2024-05-15 00:02:43.361729] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:42.852 [2024-05-15 00:02:43.361775] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.852 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.109 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:43.109 "name": "raid_bdev1", 00:20:43.109 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:43.109 "strip_size_kb": 0, 00:20:43.109 "state": "online", 00:20:43.109 "raid_level": "raid1", 00:20:43.109 "superblock": true, 00:20:43.109 "num_base_bdevs": 2, 00:20:43.109 "num_base_bdevs_discovered": 1, 00:20:43.109 "num_base_bdevs_operational": 1, 00:20:43.109 "base_bdevs_list": [ 00:20:43.109 { 00:20:43.109 "name": null, 00:20:43.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.109 "is_configured": false, 00:20:43.109 "data_offset": 2048, 00:20:43.109 "data_size": 63488 00:20:43.109 }, 00:20:43.109 { 00:20:43.109 "name": "BaseBdev2", 00:20:43.109 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:43.109 "is_configured": true, 00:20:43.109 "data_offset": 2048, 00:20:43.109 "data_size": 63488 00:20:43.109 } 00:20:43.109 ] 00:20:43.109 }' 00:20:43.109 00:02:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:43.109 00:02:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.674 00:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:43.932 [2024-05-15 00:02:44.405050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:43.932 [2024-05-15 00:02:44.405103] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:43.932 [2024-05-15 00:02:44.405128] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd04aa0 00:20:43.932 [2024-05-15 00:02:44.405141] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:43.932 [2024-05-15 00:02:44.405514] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:43.932 [2024-05-15 00:02:44.405534] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 
00:20:43.932 [2024-05-15 00:02:44.405621] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:43.932 [2024-05-15 00:02:44.405634] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:43.932 [2024-05-15 00:02:44.405645] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:43.932 [2024-05-15 00:02:44.405664] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:43.932 [2024-05-15 00:02:44.410540] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd064b0 00:20:43.932 spare 00:20:43.932 [2024-05-15 00:02:44.411919] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:43.932 00:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # sleep 1 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.865 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.123 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:45.123 "name": "raid_bdev1", 00:20:45.123 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:45.123 "strip_size_kb": 0, 00:20:45.123 "state": "online", 00:20:45.123 "raid_level": "raid1", 00:20:45.123 "superblock": true, 00:20:45.123 "num_base_bdevs": 2, 00:20:45.123 "num_base_bdevs_discovered": 2, 00:20:45.123 "num_base_bdevs_operational": 2, 00:20:45.123 "process": { 00:20:45.123 "type": "rebuild", 00:20:45.123 "target": "spare", 00:20:45.123 "progress": { 00:20:45.123 "blocks": 24576, 00:20:45.123 "percent": 38 00:20:45.123 } 00:20:45.123 }, 00:20:45.123 "base_bdevs_list": [ 00:20:45.123 { 00:20:45.123 "name": "spare", 00:20:45.123 "uuid": "e5c60d23-4c1a-56cd-9c50-94ca8b1ff519", 00:20:45.123 "is_configured": true, 00:20:45.123 "data_offset": 2048, 00:20:45.123 "data_size": 63488 00:20:45.123 }, 00:20:45.123 { 00:20:45.123 "name": "BaseBdev2", 00:20:45.123 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:45.123 "is_configured": true, 00:20:45.123 "data_offset": 2048, 00:20:45.123 "data_size": 63488 00:20:45.123 } 00:20:45.123 ] 00:20:45.123 }' 00:20:45.123 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:45.382 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:45.382 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:45.382 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:45.382 00:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:45.640 [2024-05-15 00:02:45.995552] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:45.640 [2024-05-15 00:02:46.024459] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:45.640 [2024-05-15 00:02:46.024506] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.640 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.899 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:45.899 "name": "raid_bdev1", 00:20:45.899 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:45.899 "strip_size_kb": 0, 00:20:45.899 "state": "online", 00:20:45.899 "raid_level": "raid1", 00:20:45.899 "superblock": true, 00:20:45.899 "num_base_bdevs": 2, 00:20:45.899 "num_base_bdevs_discovered": 1, 00:20:45.899 "num_base_bdevs_operational": 1, 00:20:45.899 "base_bdevs_list": [ 00:20:45.899 { 00:20:45.899 "name": null, 00:20:45.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.899 "is_configured": false, 00:20:45.899 "data_offset": 2048, 00:20:45.899 "data_size": 63488 00:20:45.899 }, 00:20:45.899 { 00:20:45.899 "name": "BaseBdev2", 00:20:45.899 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:45.899 "is_configured": true, 00:20:45.899 "data_offset": 2048, 00:20:45.899 "data_size": 63488 00:20:45.899 } 00:20:45.899 ] 00:20:45.899 }' 00:20:45.899 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:45.899 00:02:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:46.465 00:02:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.465 00:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.723 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:46.723 "name": "raid_bdev1", 00:20:46.723 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:46.723 "strip_size_kb": 0, 00:20:46.723 "state": "online", 00:20:46.723 "raid_level": "raid1", 00:20:46.723 "superblock": true, 00:20:46.723 "num_base_bdevs": 2, 00:20:46.723 "num_base_bdevs_discovered": 1, 00:20:46.723 "num_base_bdevs_operational": 1, 00:20:46.723 "base_bdevs_list": [ 00:20:46.723 { 00:20:46.723 "name": null, 00:20:46.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.723 "is_configured": false, 00:20:46.723 "data_offset": 2048, 00:20:46.723 "data_size": 63488 00:20:46.723 }, 00:20:46.723 { 00:20:46.723 "name": "BaseBdev2", 00:20:46.723 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:46.723 "is_configured": true, 00:20:46.723 "data_offset": 2048, 00:20:46.724 "data_size": 63488 00:20:46.724 } 00:20:46.724 ] 00:20:46.724 }' 00:20:46.724 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:46.724 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:46.724 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:46.724 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:46.724 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:46.982 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:47.239 [2024-05-15 00:02:47.713321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:47.239 [2024-05-15 00:02:47.713370] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.239 [2024-05-15 00:02:47.713392] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeaf530 00:20:47.239 [2024-05-15 00:02:47.713416] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.239 [2024-05-15 00:02:47.713754] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.239 [2024-05-15 00:02:47.713774] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:47.239 [2024-05-15 00:02:47.713838] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:47.239 [2024-05-15 00:02:47.713851] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:47.239 [2024-05-15 00:02:47.713860] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:47.239 BaseBdev1 00:20:47.239 00:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@786 -- # 
sleep 1 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.174 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.434 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:48.434 "name": "raid_bdev1", 00:20:48.434 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:48.434 "strip_size_kb": 0, 00:20:48.434 "state": "online", 00:20:48.434 "raid_level": "raid1", 00:20:48.434 "superblock": true, 00:20:48.434 "num_base_bdevs": 2, 00:20:48.434 "num_base_bdevs_discovered": 1, 00:20:48.434 "num_base_bdevs_operational": 1, 00:20:48.434 "base_bdevs_list": [ 00:20:48.434 { 00:20:48.434 "name": null, 00:20:48.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.434 "is_configured": false, 00:20:48.434 "data_offset": 2048, 00:20:48.434 "data_size": 63488 00:20:48.434 }, 00:20:48.434 { 00:20:48.434 "name": "BaseBdev2", 00:20:48.434 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:48.434 "is_configured": true, 00:20:48.434 "data_offset": 2048, 00:20:48.434 "data_size": 63488 00:20:48.434 } 00:20:48.434 ] 00:20:48.434 }' 00:20:48.434 00:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:48.434 00:02:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.000 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:49.000 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:49.001 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:49.001 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:49.001 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:49.001 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.001 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.258 00:02:49 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:49.258 "name": "raid_bdev1", 00:20:49.258 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:49.258 "strip_size_kb": 0, 00:20:49.258 "state": "online", 00:20:49.258 "raid_level": "raid1", 00:20:49.258 "superblock": true, 00:20:49.258 "num_base_bdevs": 2, 00:20:49.258 "num_base_bdevs_discovered": 1, 00:20:49.258 "num_base_bdevs_operational": 1, 00:20:49.258 "base_bdevs_list": [ 00:20:49.258 { 00:20:49.258 "name": null, 00:20:49.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.258 "is_configured": false, 00:20:49.258 "data_offset": 2048, 00:20:49.258 "data_size": 63488 00:20:49.258 }, 00:20:49.258 { 00:20:49.258 "name": "BaseBdev2", 00:20:49.258 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:49.258 "is_configured": true, 00:20:49.258 "data_offset": 2048, 00:20:49.258 "data_size": 63488 00:20:49.258 } 00:20:49.258 ] 00:20:49.258 }' 00:20:49.258 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:49.515 00:02:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:49.772 [2024-05-15 00:02:50.143777] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:49.772 [2024-05-15 00:02:50.143904] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock 
seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:49.772 [2024-05-15 00:02:50.143920] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:49.772 request: 00:20:49.772 { 00:20:49.772 "raid_bdev": "raid_bdev1", 00:20:49.772 "base_bdev": "BaseBdev1", 00:20:49.772 "method": "bdev_raid_add_base_bdev", 00:20:49.772 "req_id": 1 00:20:49.772 } 00:20:49.772 Got JSON-RPC error response 00:20:49.772 response: 00:20:49.772 { 00:20:49.772 "code": -22, 00:20:49.772 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:49.772 } 00:20:49.772 00:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:49.772 00:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:49.772 00:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:49.772 00:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:49.772 00:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # sleep 1 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.706 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.964 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:50.964 "name": "raid_bdev1", 00:20:50.964 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:50.964 "strip_size_kb": 0, 00:20:50.964 "state": "online", 00:20:50.964 "raid_level": "raid1", 00:20:50.964 "superblock": true, 00:20:50.964 "num_base_bdevs": 2, 00:20:50.964 "num_base_bdevs_discovered": 1, 00:20:50.964 "num_base_bdevs_operational": 1, 00:20:50.964 "base_bdevs_list": [ 00:20:50.964 { 00:20:50.964 "name": null, 00:20:50.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.964 "is_configured": false, 00:20:50.964 "data_offset": 2048, 00:20:50.964 "data_size": 63488 00:20:50.964 }, 00:20:50.964 { 00:20:50.964 "name": "BaseBdev2", 00:20:50.964 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:50.964 "is_configured": true, 00:20:50.964 "data_offset": 2048, 00:20:50.964 "data_size": 63488 00:20:50.964 } 00:20:50.964 ] 00:20:50.964 }' 00:20:50.964 00:02:51 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:50.964 00:02:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.529 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:51.529 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:51.529 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:51.529 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:51.529 00:02:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:51.529 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.529 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:51.787 "name": "raid_bdev1", 00:20:51.787 "uuid": "fb2b3c58-059e-4167-9b3e-5fe9f41c49f0", 00:20:51.787 "strip_size_kb": 0, 00:20:51.787 "state": "online", 00:20:51.787 "raid_level": "raid1", 00:20:51.787 "superblock": true, 00:20:51.787 "num_base_bdevs": 2, 00:20:51.787 "num_base_bdevs_discovered": 1, 00:20:51.787 "num_base_bdevs_operational": 1, 00:20:51.787 "base_bdevs_list": [ 00:20:51.787 { 00:20:51.787 "name": null, 00:20:51.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.787 "is_configured": false, 00:20:51.787 "data_offset": 2048, 00:20:51.787 "data_size": 63488 00:20:51.787 }, 00:20:51.787 { 00:20:51.787 "name": "BaseBdev2", 00:20:51.787 "uuid": "58b6500b-2bf4-57b8-bfb8-8ddf54c2e7d2", 00:20:51.787 "is_configured": true, 00:20:51.787 "data_offset": 2048, 00:20:51.787 "data_size": 63488 00:20:51.787 } 00:20:51.787 ] 00:20:51.787 }' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # killprocess 475471 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 475471 ']' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 475471 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 475471 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:51.787 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 475471' 00:20:51.787 killing process with pid 475471 00:20:51.787 00:02:52 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@965 -- # kill 475471 00:20:51.787 Received shutdown signal, test time was about 60.000000 seconds 00:20:51.787 00:20:51.787 Latency(us) 00:20:51.787 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:51.787 =================================================================================================================== 00:20:51.787 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:51.787 [2024-05-15 00:02:52.373527] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:51.787 [2024-05-15 00:02:52.373631] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:51.787 [2024-05-15 00:02:52.373682] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to fr 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 475471 00:20:51.787 ee all in destruct 00:20:51.787 [2024-05-15 00:02:52.373698] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcfde40 name raid_bdev1, state offline 00:20:52.045 [2024-05-15 00:02:52.404899] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@797 -- # return 0 00:20:52.303 00:20:52.303 real 0m37.119s 00:20:52.303 user 0m53.409s 00:20:52.303 sys 0m7.045s 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:52.303 ************************************ 00:20:52.303 END TEST raid_rebuild_test_sb 00:20:52.303 ************************************ 00:20:52.303 00:02:52 bdev_raid -- bdev/bdev_raid.sh@825 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:20:52.303 00:02:52 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:52.303 00:02:52 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:52.303 00:02:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:52.303 ************************************ 00:20:52.303 START TEST raid_rebuild_test_io 00:20:52.303 ************************************ 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false true true 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:20:52.303 
00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # raid_pid=480686 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 480686 /var/tmp/spdk-raid.sock 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 480686 ']' 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:52.303 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:52.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:52.304 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:52.304 00:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:52.304 [2024-05-15 00:02:52.811438] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:20:52.304 [2024-05-15 00:02:52.811503] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480686 ] 00:20:52.304 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:52.304 Zero copy mechanism will not be used. 
00:20:52.561 [2024-05-15 00:02:52.930728] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:52.561 [2024-05-15 00:02:53.037158] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:52.561 [2024-05-15 00:02:53.102762] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:52.561 [2024-05-15 00:02:53.102800] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:53.491 00:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:53.491 00:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:20:53.491 00:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:53.491 00:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:53.491 BaseBdev1_malloc 00:20:53.492 00:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:53.749 [2024-05-15 00:02:54.207062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:53.749 [2024-05-15 00:02:54.207114] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:53.749 [2024-05-15 00:02:54.207140] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1553b50 00:20:53.749 [2024-05-15 00:02:54.207154] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:53.749 [2024-05-15 00:02:54.208940] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:53.749 [2024-05-15 00:02:54.208973] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:53.749 BaseBdev1 00:20:53.749 00:02:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:53.749 00:02:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:54.006 BaseBdev2_malloc 00:20:54.006 00:02:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:54.264 [2024-05-15 00:02:54.714471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:54.264 [2024-05-15 00:02:54.714533] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:54.264 [2024-05-15 00:02:54.714556] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f9d10 00:20:54.264 [2024-05-15 00:02:54.714569] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:54.264 [2024-05-15 00:02:54.716205] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:54.264 [2024-05-15 00:02:54.716237] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:54.264 BaseBdev2 00:20:54.265 00:02:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:20:54.522 spare_malloc 00:20:54.522 00:02:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:54.781 spare_delay 00:20:54.781 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:55.073 [2024-05-15 00:02:55.466309] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:55.073 [2024-05-15 00:02:55.466360] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:55.073 [2024-05-15 00:02:55.466382] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fc240 00:20:55.073 [2024-05-15 00:02:55.466395] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:55.073 [2024-05-15 00:02:55.467934] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:55.073 [2024-05-15 00:02:55.467964] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:55.073 spare 00:20:55.073 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:55.331 [2024-05-15 00:02:55.723006] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:55.331 [2024-05-15 00:02:55.724421] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:55.331 [2024-05-15 00:02:55.724502] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x154cf00 00:20:55.331 [2024-05-15 00:02:55.724513] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:55.331 [2024-05-15 00:02:55.724729] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154b510 00:20:55.331 [2024-05-15 00:02:55.724885] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x154cf00 00:20:55.331 [2024-05-15 00:02:55.724895] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x154cf00 00:20:55.331 [2024-05-15 00:02:55.725024] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:55.331 00:02:55 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.331 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.591 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:55.591 "name": "raid_bdev1", 00:20:55.591 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:20:55.591 "strip_size_kb": 0, 00:20:55.591 "state": "online", 00:20:55.591 "raid_level": "raid1", 00:20:55.591 "superblock": false, 00:20:55.591 "num_base_bdevs": 2, 00:20:55.591 "num_base_bdevs_discovered": 2, 00:20:55.591 "num_base_bdevs_operational": 2, 00:20:55.591 "base_bdevs_list": [ 00:20:55.591 { 00:20:55.591 "name": "BaseBdev1", 00:20:55.591 "uuid": "42778047-eec0-5af0-8f90-d3becb3170ae", 00:20:55.591 "is_configured": true, 00:20:55.591 "data_offset": 0, 00:20:55.591 "data_size": 65536 00:20:55.591 }, 00:20:55.591 { 00:20:55.591 "name": "BaseBdev2", 00:20:55.591 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:20:55.591 "is_configured": true, 00:20:55.591 "data_offset": 0, 00:20:55.591 "data_size": 65536 00:20:55.591 } 00:20:55.591 ] 00:20:55.591 }' 00:20:55.591 00:02:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:55.591 00:02:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:56.157 00:02:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:56.157 00:02:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:20:56.415 [2024-05-15 00:02:56.794048] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:56.415 00:02:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:20:56.415 00:02:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.415 00:02:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:56.673 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:20:56.673 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:20:56.673 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:56.673 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:56.673 [2024-05-15 00:02:57.164846] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154c5b0 00:20:56.673 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:56.673 Zero copy mechanism will not be used. 00:20:56.673 Running I/O for 60 seconds... 
00:20:56.931 [2024-05-15 00:02:57.289522] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:56.931 [2024-05-15 00:02:57.297660] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x154c5b0 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.931 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.189 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:57.189 "name": "raid_bdev1", 00:20:57.189 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:20:57.189 "strip_size_kb": 0, 00:20:57.189 "state": "online", 00:20:57.189 "raid_level": "raid1", 00:20:57.189 "superblock": false, 00:20:57.189 "num_base_bdevs": 2, 00:20:57.189 "num_base_bdevs_discovered": 1, 00:20:57.189 "num_base_bdevs_operational": 1, 00:20:57.189 "base_bdevs_list": [ 00:20:57.189 { 00:20:57.189 "name": null, 00:20:57.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.189 "is_configured": false, 00:20:57.189 "data_offset": 0, 00:20:57.189 "data_size": 65536 00:20:57.189 }, 00:20:57.189 { 00:20:57.189 "name": "BaseBdev2", 00:20:57.189 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:20:57.189 "is_configured": true, 00:20:57.189 "data_offset": 0, 00:20:57.189 "data_size": 65536 00:20:57.189 } 00:20:57.189 ] 00:20:57.189 }' 00:20:57.189 00:02:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:57.189 00:02:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:57.756 00:02:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:58.013 [2024-05-15 00:02:58.403264] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:58.013 00:02:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:20:58.013 [2024-05-15 00:02:58.462210] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ea710 00:20:58.013 [2024-05-15 00:02:58.464590] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:58.013 [2024-05-15 00:02:58.582547] 
bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:58.013 [2024-05-15 00:02:58.582993] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:58.271 [2024-05-15 00:02:58.818722] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:58.271 [2024-05-15 00:02:58.818922] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:58.836 [2024-05-15 00:02:59.181048] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:58.836 [2024-05-15 00:02:59.290404] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:58.836 [2024-05-15 00:02:59.290627] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.094 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.094 [2024-05-15 00:02:59.638457] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:59.352 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:59.352 "name": "raid_bdev1", 00:20:59.352 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:20:59.352 "strip_size_kb": 0, 00:20:59.352 "state": "online", 00:20:59.352 "raid_level": "raid1", 00:20:59.352 "superblock": false, 00:20:59.352 "num_base_bdevs": 2, 00:20:59.352 "num_base_bdevs_discovered": 2, 00:20:59.352 "num_base_bdevs_operational": 2, 00:20:59.352 "process": { 00:20:59.352 "type": "rebuild", 00:20:59.352 "target": "spare", 00:20:59.352 "progress": { 00:20:59.352 "blocks": 14336, 00:20:59.352 "percent": 21 00:20:59.352 } 00:20:59.352 }, 00:20:59.352 "base_bdevs_list": [ 00:20:59.352 { 00:20:59.352 "name": "spare", 00:20:59.352 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:20:59.352 "is_configured": true, 00:20:59.352 "data_offset": 0, 00:20:59.352 "data_size": 65536 00:20:59.352 }, 00:20:59.352 { 00:20:59.352 "name": "BaseBdev2", 00:20:59.352 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:20:59.352 "is_configured": true, 00:20:59.352 "data_offset": 0, 00:20:59.352 "data_size": 65536 00:20:59.352 } 00:20:59.352 ] 00:20:59.352 }' 00:20:59.352 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:59.352 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:59.352 00:02:59 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:59.352 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:59.353 00:02:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:59.610 [2024-05-15 00:03:00.027783] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:59.610 [2024-05-15 00:03:00.095951] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:59.610 [2024-05-15 00:03:00.114627] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.610 [2024-05-15 00:03:00.145223] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x154c5b0 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.610 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.867 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:59.867 "name": "raid_bdev1", 00:20:59.867 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:20:59.867 "strip_size_kb": 0, 00:20:59.867 "state": "online", 00:20:59.867 "raid_level": "raid1", 00:20:59.867 "superblock": false, 00:20:59.867 "num_base_bdevs": 2, 00:20:59.867 "num_base_bdevs_discovered": 1, 00:20:59.867 "num_base_bdevs_operational": 1, 00:20:59.867 "base_bdevs_list": [ 00:20:59.867 { 00:20:59.867 "name": null, 00:20:59.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.867 "is_configured": false, 00:20:59.867 "data_offset": 0, 00:20:59.867 "data_size": 65536 00:20:59.867 }, 00:20:59.867 { 00:20:59.867 "name": "BaseBdev2", 00:20:59.867 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:20:59.867 "is_configured": true, 00:20:59.867 "data_offset": 0, 00:20:59.867 "data_size": 65536 00:20:59.867 } 00:20:59.867 ] 00:20:59.867 }' 00:20:59.867 00:03:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:59.867 00:03:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:00.800 "name": "raid_bdev1", 00:21:00.800 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:00.800 "strip_size_kb": 0, 00:21:00.800 "state": "online", 00:21:00.800 "raid_level": "raid1", 00:21:00.800 "superblock": false, 00:21:00.800 "num_base_bdevs": 2, 00:21:00.800 "num_base_bdevs_discovered": 1, 00:21:00.800 "num_base_bdevs_operational": 1, 00:21:00.800 "base_bdevs_list": [ 00:21:00.800 { 00:21:00.800 "name": null, 00:21:00.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.800 "is_configured": false, 00:21:00.800 "data_offset": 0, 00:21:00.800 "data_size": 65536 00:21:00.800 }, 00:21:00.800 { 00:21:00.800 "name": "BaseBdev2", 00:21:00.800 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:00.800 "is_configured": true, 00:21:00.800 "data_offset": 0, 00:21:00.800 "data_size": 65536 00:21:00.800 } 00:21:00.800 ] 00:21:00.800 }' 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:00.800 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:01.058 [2024-05-15 00:03:01.608024] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:01.315 00:03:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:21:01.315 [2024-05-15 00:03:01.651415] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x125bcd0 00:21:01.315 [2024-05-15 00:03:01.652948] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:01.315 [2024-05-15 00:03:01.780035] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:01.315 [2024-05-15 00:03:01.780495] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:01.573 [2024-05-15 00:03:02.000162] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:01.573 [2024-05-15 00:03:02.000441] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 
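The verify_raid_bdev_process check traced above reduces to one RPC call plus two jq lookups: pull the raid bdev's entry from bdev_raid_get_bdevs, default the missing process fields to "none" with jq's // operator, and then re-add the removed spare to kick off a fresh rebuild. A minimal standalone sketch of that sequence (socket path and bdev names taken from this run; the helper name is mine, not part of the test script):

```bash
#!/usr/bin/env bash
# Minimal sketch of the process-type/target check seen in the trace above.
# Assumes the bdevperf app is already listening on /var/tmp/spdk-raid.sock.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

check_raid_process() {
    local raid=$1 want_type=$2 want_target=$3
    local info type target
    # Grab only the entry for the raid bdev we care about.
    info=$($rpc bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$raid\")")
    # ".process" is absent when no rebuild is running, so default to "none".
    type=$(jq -r '.process.type // "none"' <<< "$info")
    target=$(jq -r '.process.target // "none"' <<< "$info")
    [[ $type == "$want_type" && $target == "$want_target" ]]
}

check_raid_process raid_bdev1 none none && echo "no rebuild in flight"
# Re-adding the removed base bdev starts a new rebuild, as the NOTICE line shows.
$rpc bdev_raid_add_base_bdev raid_bdev1 spare
```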
00:21:01.831 [2024-05-15 00:03:02.356278] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:02.088 [2024-05-15 00:03:02.583033] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:02.088 [2024-05-15 00:03:02.583259] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.088 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.346 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:02.346 "name": "raid_bdev1", 00:21:02.346 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:02.346 "strip_size_kb": 0, 00:21:02.346 "state": "online", 00:21:02.346 "raid_level": "raid1", 00:21:02.346 "superblock": false, 00:21:02.346 "num_base_bdevs": 2, 00:21:02.346 "num_base_bdevs_discovered": 2, 00:21:02.346 "num_base_bdevs_operational": 2, 00:21:02.346 "process": { 00:21:02.346 "type": "rebuild", 00:21:02.346 "target": "spare", 00:21:02.346 "progress": { 00:21:02.346 "blocks": 12288, 00:21:02.346 "percent": 18 00:21:02.346 } 00:21:02.346 }, 00:21:02.346 "base_bdevs_list": [ 00:21:02.346 { 00:21:02.346 "name": "spare", 00:21:02.346 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:02.346 "is_configured": true, 00:21:02.346 "data_offset": 0, 00:21:02.346 "data_size": 65536 00:21:02.346 }, 00:21:02.346 { 00:21:02.346 "name": "BaseBdev2", 00:21:02.346 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:02.346 "is_configured": true, 00:21:02.346 "data_offset": 0, 00:21:02.346 "data_size": 65536 00:21:02.346 } 00:21:02.346 ] 00:21:02.346 }' 00:21:02.346 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:02.346 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:02.346 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:02.346 [2024-05-15 00:03:02.906985] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:02.346 [2024-05-15 00:03:02.907474] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 
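While the rebuild is in flight, the same JSON gains a "process" object with a blocks/percent progress counter, which is what the repeated verify_raid_bdev_process raid_bdev1 rebuild spare calls keep re-reading. A hedged sketch of a progress watcher built on that field (the polling loop and variable names are mine, not part of the test script):

```bash
# Hypothetical rebuild-progress watcher built on the same RPC the test uses.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

while true; do
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    type=$(jq -r '.process.type // "none"' <<< "$info")
    # Once the rebuild finishes, the "process" object is dropped from the JSON.
    [[ $type != "rebuild" ]] && break
    blocks=$(jq -r '.process.progress.blocks' <<< "$info")
    percent=$(jq -r '.process.progress.percent' <<< "$info")
    echo "rebuild target=$(jq -r '.process.target' <<< "$info") blocks=$blocks (${percent}%)"
    sleep 1
done
```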
00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # local timeout=673 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.605 00:03:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.605 [2024-05-15 00:03:03.119791] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:02.605 [2024-05-15 00:03:03.120087] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:02.605 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:02.605 "name": "raid_bdev1", 00:21:02.605 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:02.605 "strip_size_kb": 0, 00:21:02.605 "state": "online", 00:21:02.605 "raid_level": "raid1", 00:21:02.605 "superblock": false, 00:21:02.605 "num_base_bdevs": 2, 00:21:02.605 "num_base_bdevs_discovered": 2, 00:21:02.605 "num_base_bdevs_operational": 2, 00:21:02.605 "process": { 00:21:02.605 "type": "rebuild", 00:21:02.605 "target": "spare", 00:21:02.605 "progress": { 00:21:02.605 "blocks": 16384, 00:21:02.605 "percent": 25 00:21:02.605 } 00:21:02.605 }, 00:21:02.605 "base_bdevs_list": [ 00:21:02.605 { 00:21:02.605 "name": "spare", 00:21:02.605 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:02.605 "is_configured": true, 00:21:02.605 "data_offset": 0, 00:21:02.605 "data_size": 65536 00:21:02.605 }, 00:21:02.605 { 00:21:02.605 "name": "BaseBdev2", 00:21:02.605 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:02.605 "is_configured": true, 00:21:02.605 "data_offset": 0, 00:21:02.605 "data_size": 65536 00:21:02.605 } 00:21:02.605 ] 00:21:02.605 }' 00:21:02.605 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:02.863 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:02.863 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:02.863 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:02.863 00:03:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:03.121 [2024-05-15 00:03:03.504513] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:03.378 [2024-05-15 00:03:03.742152] 
bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:03.635 [2024-05-15 00:03:04.089155] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:03.635 [2024-05-15 00:03:04.089623] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.893 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.893 [2024-05-15 00:03:04.319739] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:04.150 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:04.150 "name": "raid_bdev1", 00:21:04.150 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:04.150 "strip_size_kb": 0, 00:21:04.150 "state": "online", 00:21:04.150 "raid_level": "raid1", 00:21:04.150 "superblock": false, 00:21:04.150 "num_base_bdevs": 2, 00:21:04.150 "num_base_bdevs_discovered": 2, 00:21:04.150 "num_base_bdevs_operational": 2, 00:21:04.150 "process": { 00:21:04.150 "type": "rebuild", 00:21:04.150 "target": "spare", 00:21:04.151 "progress": { 00:21:04.151 "blocks": 28672, 00:21:04.151 "percent": 43 00:21:04.151 } 00:21:04.151 }, 00:21:04.151 "base_bdevs_list": [ 00:21:04.151 { 00:21:04.151 "name": "spare", 00:21:04.151 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:04.151 "is_configured": true, 00:21:04.151 "data_offset": 0, 00:21:04.151 "data_size": 65536 00:21:04.151 }, 00:21:04.151 { 00:21:04.151 "name": "BaseBdev2", 00:21:04.151 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:04.151 "is_configured": true, 00:21:04.151 "data_offset": 0, 00:21:04.151 "data_size": 65536 00:21:04.151 } 00:21:04.151 ] 00:21:04.151 }' 00:21:04.151 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:04.151 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:04.151 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:04.151 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:04.151 00:03:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:05.082 [2024-05-15 00:03:05.402630] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.082 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.339 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:05.339 "name": "raid_bdev1", 00:21:05.339 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:05.339 "strip_size_kb": 0, 00:21:05.339 "state": "online", 00:21:05.339 "raid_level": "raid1", 00:21:05.339 "superblock": false, 00:21:05.339 "num_base_bdevs": 2, 00:21:05.339 "num_base_bdevs_discovered": 2, 00:21:05.339 "num_base_bdevs_operational": 2, 00:21:05.339 "process": { 00:21:05.339 "type": "rebuild", 00:21:05.339 "target": "spare", 00:21:05.339 "progress": { 00:21:05.339 "blocks": 51200, 00:21:05.339 "percent": 78 00:21:05.339 } 00:21:05.339 }, 00:21:05.339 "base_bdevs_list": [ 00:21:05.339 { 00:21:05.339 "name": "spare", 00:21:05.339 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:05.339 "is_configured": true, 00:21:05.339 "data_offset": 0, 00:21:05.339 "data_size": 65536 00:21:05.339 }, 00:21:05.339 { 00:21:05.339 "name": "BaseBdev2", 00:21:05.339 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:05.339 "is_configured": true, 00:21:05.339 "data_offset": 0, 00:21:05.339 "data_size": 65536 00:21:05.339 } 00:21:05.339 ] 00:21:05.339 }' 00:21:05.339 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:05.339 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:05.339 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:05.596 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:05.596 00:03:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:06.162 [2024-05-15 00:03:06.509126] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:06.162 [2024-05-15 00:03:06.609447] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:06.162 [2024-05-15 00:03:06.619290] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:06.420 00:03:06 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.420 00:03:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:06.677 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:06.677 "name": "raid_bdev1", 00:21:06.677 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:06.677 "strip_size_kb": 0, 00:21:06.677 "state": "online", 00:21:06.677 "raid_level": "raid1", 00:21:06.677 "superblock": false, 00:21:06.677 "num_base_bdevs": 2, 00:21:06.677 "num_base_bdevs_discovered": 2, 00:21:06.677 "num_base_bdevs_operational": 2, 00:21:06.677 "base_bdevs_list": [ 00:21:06.677 { 00:21:06.677 "name": "spare", 00:21:06.677 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:06.677 "is_configured": true, 00:21:06.677 "data_offset": 0, 00:21:06.677 "data_size": 65536 00:21:06.677 }, 00:21:06.677 { 00:21:06.677 "name": "BaseBdev2", 00:21:06.677 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:06.677 "is_configured": true, 00:21:06.677 "data_offset": 0, 00:21:06.677 "data_size": 65536 00:21:06.677 } 00:21:06.677 ] 00:21:06.677 }' 00:21:06.677 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:06.677 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # break 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.934 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.190 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:07.190 "name": "raid_bdev1", 00:21:07.190 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:07.190 "strip_size_kb": 0, 00:21:07.190 "state": "online", 00:21:07.190 "raid_level": "raid1", 00:21:07.190 "superblock": false, 00:21:07.190 "num_base_bdevs": 2, 00:21:07.190 "num_base_bdevs_discovered": 2, 00:21:07.190 "num_base_bdevs_operational": 2, 00:21:07.190 "base_bdevs_list": [ 00:21:07.190 { 00:21:07.190 "name": "spare", 00:21:07.190 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:07.190 "is_configured": true, 00:21:07.190 
"data_offset": 0, 00:21:07.190 "data_size": 65536 00:21:07.190 }, 00:21:07.190 { 00:21:07.190 "name": "BaseBdev2", 00:21:07.190 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:07.190 "is_configured": true, 00:21:07.190 "data_offset": 0, 00:21:07.190 "data_size": 65536 00:21:07.190 } 00:21:07.191 ] 00:21:07.191 }' 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.191 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.448 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:07.448 "name": "raid_bdev1", 00:21:07.448 "uuid": "cf0ab57b-1bd2-4756-b2f5-0b58d0a75d43", 00:21:07.448 "strip_size_kb": 0, 00:21:07.448 "state": "online", 00:21:07.448 "raid_level": "raid1", 00:21:07.448 "superblock": false, 00:21:07.448 "num_base_bdevs": 2, 00:21:07.448 "num_base_bdevs_discovered": 2, 00:21:07.448 "num_base_bdevs_operational": 2, 00:21:07.448 "base_bdevs_list": [ 00:21:07.448 { 00:21:07.448 "name": "spare", 00:21:07.448 "uuid": "148ddc05-c707-59e2-8f74-d44d3a8d7278", 00:21:07.448 "is_configured": true, 00:21:07.448 "data_offset": 0, 00:21:07.448 "data_size": 65536 00:21:07.448 }, 00:21:07.448 { 00:21:07.448 "name": "BaseBdev2", 00:21:07.448 "uuid": "d649b027-98e1-529b-be31-4bc627c02073", 00:21:07.448 "is_configured": true, 00:21:07.448 "data_offset": 0, 00:21:07.448 "data_size": 65536 00:21:07.448 } 00:21:07.448 ] 00:21:07.448 }' 00:21:07.448 00:03:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:07.448 00:03:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:08.013 00:03:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:21:08.271 [2024-05-15 00:03:08.716933] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:08.271 [2024-05-15 00:03:08.716963] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:08.271 00:21:08.271 Latency(us) 00:21:08.271 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:08.271 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:08.271 raid_bdev1 : 11.53 97.97 293.92 0.00 0.00 13610.67 293.84 111240.24 00:21:08.271 =================================================================================================================== 00:21:08.271 Total : 97.97 293.92 0.00 0.00 13610.67 293.84 111240.24 00:21:08.271 [2024-05-15 00:03:08.732856] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:08.271 [2024-05-15 00:03:08.732882] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:08.271 [2024-05-15 00:03:08.732954] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:08.271 [2024-05-15 00:03:08.732966] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x154cf00 name raid_bdev1, state offline 00:21:08.271 0 00:21:08.271 00:03:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.271 00:03:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # jq length 00:21:08.529 00:03:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.529 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:08.787 /dev/nbd0 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:08.787 1+0 records in 00:21:08.787 1+0 records out 00:21:08.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000192183 s, 21.3 MB/s 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev2 ']' 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:08.787 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:09.091 /dev/nbd1 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 
-- # local i 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:09.092 1+0 records in 00:21:09.092 1+0 records out 00:21:09.092 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281245 s, 14.6 MB/s 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:09.092 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:09.352 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@45 -- # return 0 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:09.609 00:03:09 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@795 -- # killprocess 480686 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 480686 ']' 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 480686 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 480686 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:09.866 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 480686' 00:21:09.866 killing process with pid 480686 00:21:09.867 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@965 -- # kill 480686 00:21:09.867 Received shutdown signal, test time was about 13.080140 seconds 00:21:09.867 00:21:09.867 Latency(us) 00:21:09.867 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:09.867 =================================================================================================================== 00:21:09.867 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:09.867 [2024-05-15 00:03:10.279098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:09.867 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 480686 
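The data-integrity pass above exports the rebuilt spare and the surviving BaseBdev2 through the kernel NBD driver and byte-compares them before tearing everything down. A trimmed-down replay of that flow (device nodes and RPCs as in this run; the readiness loop is simplified relative to the real waitfornbd helper):

```bash
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Export both bdevs over the kernel NBD driver.
$rpc nbd_start_disk spare     /dev/nbd0
$rpc nbd_start_disk BaseBdev2 /dev/nbd1

# Wait until the kernel has registered each device, then prove it is readable
# with a single direct 4 KiB read (the dd seen in the trace).
for nbd in nbd0 nbd1; do
    for _ in $(seq 1 20); do
        grep -q -w "$nbd" /proc/partitions && break
        sleep 0.1
    done
    dd if=/dev/$nbd of=/dev/null bs=4096 count=1 iflag=direct
done

# The rebuilt mirror must be byte-identical to the remaining base bdev
# (offset 0, since this variant has no superblock and data_offset is 0).
cmp -i 0 /dev/nbd0 /dev/nbd1 && echo "mirror contents match"

# Detach the NBD devices before shutting the app down.
$rpc nbd_stop_disk /dev/nbd1
$rpc nbd_stop_disk /dev/nbd0
```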
00:21:09.867 [2024-05-15 00:03:10.301027] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@797 -- # return 0 00:21:10.124 00:21:10.124 real 0m17.809s 00:21:10.124 user 0m27.088s 00:21:10.124 sys 0m2.807s 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:10.124 ************************************ 00:21:10.124 END TEST raid_rebuild_test_io 00:21:10.124 ************************************ 00:21:10.124 00:03:10 bdev_raid -- bdev/bdev_raid.sh@826 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:21:10.124 00:03:10 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:21:10.124 00:03:10 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:10.124 00:03:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:10.124 ************************************ 00:21:10.124 START TEST raid_rebuild_test_sb_io 00:21:10.124 ************************************ 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true true true 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 
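The next test case, raid_rebuild_test_sb_io, calls the same raid_rebuild_test function with the superblock and background-I/O flags enabled, and its positional arguments map one-for-one onto the locals echoed in the surrounding trace. The sketch below only illustrates that argument mapping and the create_arg assembly; the real function body (bdevperf launch, raid creation, rebuild steps) is elided, and the non-raid1 strip size value is illustrative:

```bash
# Sketch of how raid_rebuild_test's positional arguments are consumed,
# matching the locals echoed in the trace (function body elided).
raid_rebuild_test() {
    local raid_level=$1        # raid1
    local num_base_bdevs=$2    # 2
    local superblock=$3        # true -> pass -s to bdev_raid_create
    local background_io=$4     # true -> drive bdevperf I/O while the rebuild runs
    local verify=$5            # true -> byte-compare base bdevs over NBD at the end

    local base_bdevs=() i
    for ((i = 1; i <= num_base_bdevs; i++)); do
        base_bdevs+=("BaseBdev$i")
    done

    local strip_size=0 create_arg=""
    if [[ $raid_level != raid1 ]]; then
        strip_size=64                    # illustrative; raid1 takes no strip size, hence the 0 above
        create_arg+=" -z $strip_size"
    fi
    [[ $superblock == true ]] && create_arg+=" -s"
    echo "would run: bdev_raid_create$create_arg -r $raid_level -b '${base_bdevs[*]}' -n raid_bdev1"
    # ... bdevperf launch, raid creation and the rebuild sequence follow, as in the log
}

raid_rebuild_test raid1 2 true true true
```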
00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # raid_pid=483793 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 483793 /var/tmp/spdk-raid.sock 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 483793 ']' 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:10.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:10.124 00:03:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:10.124 [2024-05-15 00:03:10.703593] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:21:10.124 [2024-05-15 00:03:10.703663] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483793 ] 00:21:10.124 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:10.124 Zero copy mechanism will not be used. 
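bdevperf is launched in wait-for-tests mode (-z) against a dedicated RPC socket, and waitforlisten blocks until that socket answers before any bdevs are created. A simplified stand-in for that wait (the real helper does more bookkeeping, and the retry count here is arbitrary):

```bash
sock=/var/tmp/spdk-raid.sock
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

# Launch bdevperf exactly as in the log: 60 s of 50/50 randrw, 3 MiB I/Os,
# queue depth 2, waiting for an explicit perform_tests trigger (-z).
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# Poll until the UNIX socket exists and the RPC server responds.
for _ in $(seq 1 100); do
    if [[ -S $sock ]] && $rpc -s "$sock" rpc_get_methods &> /dev/null; then
        break
    fi
    sleep 0.1
done
echo "bdevperf (pid $raid_pid) is listening on $sock"
```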
00:21:10.381 [2024-05-15 00:03:10.824833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:10.381 [2024-05-15 00:03:10.932645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:10.639 [2024-05-15 00:03:11.002477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:10.639 [2024-05-15 00:03:11.002513] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:11.202 00:03:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:11.202 00:03:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:21:11.202 00:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:11.202 00:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:11.460 BaseBdev1_malloc 00:21:11.460 00:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:11.460 [2024-05-15 00:03:12.037699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:11.460 [2024-05-15 00:03:12.037747] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.460 [2024-05-15 00:03:12.037770] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179eb50 00:21:11.460 [2024-05-15 00:03:12.037783] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.460 [2024-05-15 00:03:12.039413] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.460 [2024-05-15 00:03:12.039444] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:11.460 BaseBdev1 00:21:11.717 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:11.717 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:11.717 BaseBdev2_malloc 00:21:11.974 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:11.974 [2024-05-15 00:03:12.480934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:11.974 [2024-05-15 00:03:12.480981] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.974 [2024-05-15 00:03:12.481002] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1944d10 00:21:11.974 [2024-05-15 00:03:12.481014] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.974 [2024-05-15 00:03:12.482489] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.974 [2024-05-15 00:03:12.482519] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:11.974 BaseBdev2 00:21:11.974 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:21:12.232 spare_malloc 00:21:12.232 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:12.490 spare_delay 00:21:12.490 00:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:12.747 [2024-05-15 00:03:13.124473] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:12.747 [2024-05-15 00:03:13.124520] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:12.747 [2024-05-15 00:03:13.124543] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1947240 00:21:12.747 [2024-05-15 00:03:13.124556] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:12.747 [2024-05-15 00:03:13.126188] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:12.747 [2024-05-15 00:03:13.126220] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:12.747 spare 00:21:12.747 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:13.004 [2024-05-15 00:03:13.365140] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:13.004 [2024-05-15 00:03:13.366502] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:13.004 [2024-05-15 00:03:13.366679] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1797f00 00:21:13.004 [2024-05-15 00:03:13.366693] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:13.004 [2024-05-15 00:03:13.366895] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1796510 00:21:13.004 [2024-05-15 00:03:13.367038] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1797f00 00:21:13.004 [2024-05-15 00:03:13.367048] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1797f00 00:21:13.004 [2024-05-15 00:03:13.367150] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.004 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.261 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:13.261 "name": "raid_bdev1", 00:21:13.261 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:13.261 "strip_size_kb": 0, 00:21:13.261 "state": "online", 00:21:13.261 "raid_level": "raid1", 00:21:13.261 "superblock": true, 00:21:13.261 "num_base_bdevs": 2, 00:21:13.261 "num_base_bdevs_discovered": 2, 00:21:13.261 "num_base_bdevs_operational": 2, 00:21:13.261 "base_bdevs_list": [ 00:21:13.261 { 00:21:13.261 "name": "BaseBdev1", 00:21:13.261 "uuid": "0e42369f-0a9c-5bf2-bc18-65d244becb9a", 00:21:13.261 "is_configured": true, 00:21:13.261 "data_offset": 2048, 00:21:13.261 "data_size": 63488 00:21:13.261 }, 00:21:13.261 { 00:21:13.261 "name": "BaseBdev2", 00:21:13.261 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:13.261 "is_configured": true, 00:21:13.261 "data_offset": 2048, 00:21:13.261 "data_size": 63488 00:21:13.261 } 00:21:13.261 ] 00:21:13.261 }' 00:21:13.261 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:13.261 00:03:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:13.826 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:13.826 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:21:13.826 [2024-05-15 00:03:14.376055] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:13.826 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:21:13.826 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:13.826 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.083 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:21:14.083 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:21:14.083 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:14.083 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:14.341 [2024-05-15 00:03:14.718985] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1796c90 00:21:14.341 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:14.341 Zero copy mechanism will not be used. 00:21:14.341 Running I/O for 60 seconds... 
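The bdev stack for this superblock variant is assembled purely over RPC: two 32 MB malloc bdevs wrapped in passthru bdevs become the array members, and a third malloc bdev gets a delay bdev (100 ms average and p99 write latency, per the bdev_delay_create arguments above) plus a passthru to act as the slow spare, so the rebuild lasts long enough to observe under I/O. A condensed replay of those calls, with sizes and names copied from the log (a sketch, not the script verbatim):

```bash
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two array members: a malloc bdev (32 MB, 512 B blocks) behind a passthru bdev.
for i in 1 2; do
    $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    $rpc bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev$i
done

# The spare gets an extra delay bdev: 0 us added read latency, 100000 us
# average and p99 write latency, so rebuild writes are deliberately slow.
$rpc bdev_malloc_create 32 512 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc bdev_passthru_create -b spare_delay -p spare

# RAID1 with an on-disk superblock (-s); the superblock is why the JSON above
# reports data_offset 2048 and data_size 63488 out of the 65536-block members.
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
```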
00:21:14.341 [2024-05-15 00:03:14.875416] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:14.341 [2024-05-15 00:03:14.891621] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1796c90 00:21:14.341 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:14.341 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:14.341 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.342 00:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.600 00:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:14.600 "name": "raid_bdev1", 00:21:14.600 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:14.600 "strip_size_kb": 0, 00:21:14.600 "state": "online", 00:21:14.600 "raid_level": "raid1", 00:21:14.600 "superblock": true, 00:21:14.600 "num_base_bdevs": 2, 00:21:14.600 "num_base_bdevs_discovered": 1, 00:21:14.600 "num_base_bdevs_operational": 1, 00:21:14.600 "base_bdevs_list": [ 00:21:14.600 { 00:21:14.600 "name": null, 00:21:14.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.600 "is_configured": false, 00:21:14.600 "data_offset": 2048, 00:21:14.600 "data_size": 63488 00:21:14.600 }, 00:21:14.600 { 00:21:14.600 "name": "BaseBdev2", 00:21:14.600 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:14.600 "is_configured": true, 00:21:14.600 "data_offset": 2048, 00:21:14.601 "data_size": 63488 00:21:14.601 } 00:21:14.601 ] 00:21:14.601 }' 00:21:14.601 00:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:14.601 00:03:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.534 00:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:15.534 [2024-05-15 00:03:16.019592] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:15.534 00:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:21:15.534 [2024-05-15 00:03:16.078583] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a6cd0 00:21:15.534 [2024-05-15 00:03:16.080960] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:21:15.792 [2024-05-15 00:03:16.191837] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:15.792 [2024-05-15 00:03:16.192330] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:16.049 [2024-05-15 00:03:16.402787] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:16.049 [2024-05-15 00:03:16.403015] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:16.307 [2024-05-15 00:03:16.741579] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:16.565 [2024-05-15 00:03:16.962858] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.565 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.822 [2024-05-15 00:03:17.303746] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:16.823 "name": "raid_bdev1", 00:21:16.823 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:16.823 "strip_size_kb": 0, 00:21:16.823 "state": "online", 00:21:16.823 "raid_level": "raid1", 00:21:16.823 "superblock": true, 00:21:16.823 "num_base_bdevs": 2, 00:21:16.823 "num_base_bdevs_discovered": 2, 00:21:16.823 "num_base_bdevs_operational": 2, 00:21:16.823 "process": { 00:21:16.823 "type": "rebuild", 00:21:16.823 "target": "spare", 00:21:16.823 "progress": { 00:21:16.823 "blocks": 12288, 00:21:16.823 "percent": 19 00:21:16.823 } 00:21:16.823 }, 00:21:16.823 "base_bdevs_list": [ 00:21:16.823 { 00:21:16.823 "name": "spare", 00:21:16.823 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:16.823 "is_configured": true, 00:21:16.823 "data_offset": 2048, 00:21:16.823 "data_size": 63488 00:21:16.823 }, 00:21:16.823 { 00:21:16.823 "name": "BaseBdev2", 00:21:16.823 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:16.823 "is_configured": true, 00:21:16.823 "data_offset": 2048, 00:21:16.823 "data_size": 63488 00:21:16.823 } 00:21:16.823 ] 00:21:16.823 }' 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r 
'.process.target // "none"' 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:16.823 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:17.081 [2024-05-15 00:03:17.540984] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:17.081 [2024-05-15 00:03:17.541215] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:17.081 [2024-05-15 00:03:17.629332] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:17.081 [2024-05-15 00:03:17.652375] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:17.339 [2024-05-15 00:03:17.760673] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:17.339 [2024-05-15 00:03:17.770937] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:17.339 [2024-05-15 00:03:17.793699] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1796c90 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.339 00:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.597 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:17.597 "name": "raid_bdev1", 00:21:17.597 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:17.597 "strip_size_kb": 0, 00:21:17.597 "state": "online", 00:21:17.597 "raid_level": "raid1", 00:21:17.597 "superblock": true, 00:21:17.597 "num_base_bdevs": 2, 00:21:17.597 "num_base_bdevs_discovered": 1, 00:21:17.597 "num_base_bdevs_operational": 1, 00:21:17.597 "base_bdevs_list": [ 00:21:17.597 { 00:21:17.597 "name": null, 00:21:17.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.597 "is_configured": false, 00:21:17.597 "data_offset": 2048, 00:21:17.597 "data_size": 63488 00:21:17.597 }, 00:21:17.597 { 00:21:17.597 
"name": "BaseBdev2", 00:21:17.597 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:17.597 "is_configured": true, 00:21:17.597 "data_offset": 2048, 00:21:17.597 "data_size": 63488 00:21:17.597 } 00:21:17.597 ] 00:21:17.597 }' 00:21:17.597 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:17.597 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.163 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.421 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:18.421 "name": "raid_bdev1", 00:21:18.421 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:18.421 "strip_size_kb": 0, 00:21:18.421 "state": "online", 00:21:18.421 "raid_level": "raid1", 00:21:18.421 "superblock": true, 00:21:18.421 "num_base_bdevs": 2, 00:21:18.421 "num_base_bdevs_discovered": 1, 00:21:18.421 "num_base_bdevs_operational": 1, 00:21:18.421 "base_bdevs_list": [ 00:21:18.421 { 00:21:18.421 "name": null, 00:21:18.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.421 "is_configured": false, 00:21:18.421 "data_offset": 2048, 00:21:18.421 "data_size": 63488 00:21:18.421 }, 00:21:18.421 { 00:21:18.421 "name": "BaseBdev2", 00:21:18.421 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:18.421 "is_configured": true, 00:21:18.421 "data_offset": 2048, 00:21:18.421 "data_size": 63488 00:21:18.421 } 00:21:18.421 ] 00:21:18.421 }' 00:21:18.421 00:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:18.680 00:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:18.680 00:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:18.680 00:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:18.680 00:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:18.938 [2024-05-15 00:03:19.306952] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:18.938 00:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:21:18.938 [2024-05-15 00:03:19.349104] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a6cd0 00:21:18.938 [2024-05-15 00:03:19.350612] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:18.938 [2024-05-15 00:03:19.485725] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: 
split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:18.938 [2024-05-15 00:03:19.486094] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:19.195 [2024-05-15 00:03:19.706479] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:19.195 [2024-05-15 00:03:19.706676] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:19.760 [2024-05-15 00:03:20.055311] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:19.760 [2024-05-15 00:03:20.259479] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:20.018 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:20.018 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:20.018 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:20.018 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:20.019 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:20.019 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.019 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.019 [2024-05-15 00:03:20.541594] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:20.019 [2024-05-15 00:03:20.542115] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:20.019 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:20.019 "name": "raid_bdev1", 00:21:20.019 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:20.019 "strip_size_kb": 0, 00:21:20.019 "state": "online", 00:21:20.019 "raid_level": "raid1", 00:21:20.019 "superblock": true, 00:21:20.019 "num_base_bdevs": 2, 00:21:20.019 "num_base_bdevs_discovered": 2, 00:21:20.019 "num_base_bdevs_operational": 2, 00:21:20.019 "process": { 00:21:20.019 "type": "rebuild", 00:21:20.019 "target": "spare", 00:21:20.019 "progress": { 00:21:20.019 "blocks": 14336, 00:21:20.019 "percent": 22 00:21:20.019 } 00:21:20.019 }, 00:21:20.019 "base_bdevs_list": [ 00:21:20.019 { 00:21:20.019 "name": "spare", 00:21:20.019 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:20.019 "is_configured": true, 00:21:20.019 "data_offset": 2048, 00:21:20.019 "data_size": 63488 00:21:20.019 }, 00:21:20.019 { 00:21:20.019 "name": "BaseBdev2", 00:21:20.019 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:20.019 "is_configured": true, 00:21:20.019 "data_offset": 2048, 00:21:20.019 "data_size": 63488 00:21:20.019 } 00:21:20.019 ] 00:21:20.019 }' 00:21:20.019 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:20.277 00:03:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:21:20.277 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # local timeout=691 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.277 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.277 [2024-05-15 00:03:20.753006] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:20.536 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:20.536 "name": "raid_bdev1", 00:21:20.536 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:20.536 "strip_size_kb": 0, 00:21:20.536 "state": "online", 00:21:20.536 "raid_level": "raid1", 00:21:20.536 "superblock": true, 00:21:20.536 "num_base_bdevs": 2, 00:21:20.536 "num_base_bdevs_discovered": 2, 00:21:20.536 "num_base_bdevs_operational": 2, 00:21:20.536 "process": { 00:21:20.536 "type": "rebuild", 00:21:20.536 "target": "spare", 00:21:20.536 "progress": { 00:21:20.536 "blocks": 18432, 00:21:20.536 "percent": 29 00:21:20.536 } 00:21:20.536 }, 00:21:20.536 "base_bdevs_list": [ 00:21:20.536 { 00:21:20.536 "name": "spare", 00:21:20.536 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:20.536 "is_configured": true, 00:21:20.536 "data_offset": 2048, 00:21:20.536 "data_size": 63488 00:21:20.536 }, 00:21:20.536 { 00:21:20.536 "name": "BaseBdev2", 00:21:20.536 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:20.536 "is_configured": true, 00:21:20.536 "data_offset": 2048, 00:21:20.536 "data_size": 63488 00:21:20.536 } 00:21:20.536 ] 00:21:20.536 }' 00:21:20.536 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:20.536 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ 
rebuild == \r\e\b\u\i\l\d ]] 00:21:20.536 00:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:20.536 00:03:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.536 00:03:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:20.536 [2024-05-15 00:03:21.086365] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:21.101 [2024-05-15 00:03:21.426572] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:21.101 [2024-05-15 00:03:21.426933] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:21.668 [2024-05-15 00:03:21.967058] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.668 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.668 [2024-05-15 00:03:22.179509] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:21:21.926 [2024-05-15 00:03:22.280934] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:21.926 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:21.926 "name": "raid_bdev1", 00:21:21.926 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:21.926 "strip_size_kb": 0, 00:21:21.926 "state": "online", 00:21:21.926 "raid_level": "raid1", 00:21:21.926 "superblock": true, 00:21:21.926 "num_base_bdevs": 2, 00:21:21.926 "num_base_bdevs_discovered": 2, 00:21:21.926 "num_base_bdevs_operational": 2, 00:21:21.926 "process": { 00:21:21.926 "type": "rebuild", 00:21:21.926 "target": "spare", 00:21:21.926 "progress": { 00:21:21.926 "blocks": 38912, 00:21:21.926 "percent": 61 00:21:21.926 } 00:21:21.926 }, 00:21:21.926 "base_bdevs_list": [ 00:21:21.926 { 00:21:21.926 "name": "spare", 00:21:21.926 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:21.926 "is_configured": true, 00:21:21.926 "data_offset": 2048, 00:21:21.926 "data_size": 63488 00:21:21.926 }, 00:21:21.926 { 00:21:21.926 "name": "BaseBdev2", 00:21:21.926 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:21.926 "is_configured": true, 00:21:21.926 "data_offset": 2048, 00:21:21.926 "data_size": 63488 00:21:21.926 } 00:21:21.926 ] 00:21:21.926 }' 00:21:21.926 
00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:21.926 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:21.926 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:21.926 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:21.926 00:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:21.926 [2024-05-15 00:03:22.511030] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:21:22.184 [2024-05-15 00:03:22.740271] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:21:22.748 [2024-05-15 00:03:23.182324] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.009 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.009 [2024-05-15 00:03:23.411454] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:23.009 [2024-05-15 00:03:23.529446] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:23.289 "name": "raid_bdev1", 00:21:23.289 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:23.289 "strip_size_kb": 0, 00:21:23.289 "state": "online", 00:21:23.289 "raid_level": "raid1", 00:21:23.289 "superblock": true, 00:21:23.289 "num_base_bdevs": 2, 00:21:23.289 "num_base_bdevs_discovered": 2, 00:21:23.289 "num_base_bdevs_operational": 2, 00:21:23.289 "process": { 00:21:23.289 "type": "rebuild", 00:21:23.289 "target": "spare", 00:21:23.289 "progress": { 00:21:23.289 "blocks": 59392, 00:21:23.289 "percent": 93 00:21:23.289 } 00:21:23.289 }, 00:21:23.289 "base_bdevs_list": [ 00:21:23.289 { 00:21:23.289 "name": "spare", 00:21:23.289 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:23.289 "is_configured": true, 00:21:23.289 "data_offset": 2048, 00:21:23.289 "data_size": 63488 00:21:23.289 }, 00:21:23.289 { 00:21:23.289 "name": "BaseBdev2", 00:21:23.289 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:23.289 "is_configured": true, 00:21:23.289 "data_offset": 2048, 00:21:23.289 "data_size": 63488 00:21:23.289 } 00:21:23.289 ] 
00:21:23.289 }' 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:23.289 00:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:23.289 [2024-05-15 00:03:23.860966] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:23.546 [2024-05-15 00:03:23.969254] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:23.546 [2024-05-15 00:03:23.971019] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.477 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:24.477 "name": "raid_bdev1", 00:21:24.477 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:24.477 "strip_size_kb": 0, 00:21:24.477 "state": "online", 00:21:24.477 "raid_level": "raid1", 00:21:24.477 "superblock": true, 00:21:24.477 "num_base_bdevs": 2, 00:21:24.477 "num_base_bdevs_discovered": 2, 00:21:24.477 "num_base_bdevs_operational": 2, 00:21:24.477 "base_bdevs_list": [ 00:21:24.477 { 00:21:24.477 "name": "spare", 00:21:24.477 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:24.477 "is_configured": true, 00:21:24.477 "data_offset": 2048, 00:21:24.477 "data_size": 63488 00:21:24.477 }, 00:21:24.477 { 00:21:24.477 "name": "BaseBdev2", 00:21:24.477 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:24.477 "is_configured": true, 00:21:24.477 "data_offset": 2048, 00:21:24.478 "data_size": 63488 00:21:24.478 } 00:21:24.478 ] 00:21:24.478 }' 00:21:24.478 00:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # break 00:21:24.478 00:03:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.478 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.735 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:24.735 "name": "raid_bdev1", 00:21:24.735 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:24.735 "strip_size_kb": 0, 00:21:24.735 "state": "online", 00:21:24.735 "raid_level": "raid1", 00:21:24.735 "superblock": true, 00:21:24.735 "num_base_bdevs": 2, 00:21:24.735 "num_base_bdevs_discovered": 2, 00:21:24.735 "num_base_bdevs_operational": 2, 00:21:24.735 "base_bdevs_list": [ 00:21:24.735 { 00:21:24.735 "name": "spare", 00:21:24.735 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:24.735 "is_configured": true, 00:21:24.735 "data_offset": 2048, 00:21:24.735 "data_size": 63488 00:21:24.735 }, 00:21:24.735 { 00:21:24.735 "name": "BaseBdev2", 00:21:24.735 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:24.735 "is_configured": true, 00:21:24.735 "data_offset": 2048, 00:21:24.735 "data_size": 63488 00:21:24.735 } 00:21:24.735 ] 00:21:24.735 }' 00:21:24.735 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:24.992 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:21:25.249 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:21:25.249 "name": "raid_bdev1",
00:21:25.249 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0",
00:21:25.249 "strip_size_kb": 0,
00:21:25.249 "state": "online",
00:21:25.249 "raid_level": "raid1",
00:21:25.249 "superblock": true,
00:21:25.249 "num_base_bdevs": 2,
00:21:25.249 "num_base_bdevs_discovered": 2,
00:21:25.249 "num_base_bdevs_operational": 2,
00:21:25.249 "base_bdevs_list": [
00:21:25.249 {
00:21:25.249 "name": "spare",
00:21:25.249 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1",
00:21:25.249 "is_configured": true,
00:21:25.249 "data_offset": 2048,
00:21:25.249 "data_size": 63488
00:21:25.249 },
00:21:25.249 {
00:21:25.249 "name": "BaseBdev2",
00:21:25.249 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4",
00:21:25.249 "is_configured": true,
00:21:25.249 "data_offset": 2048,
00:21:25.249 "data_size": 63488
00:21:25.249 }
00:21:25.249 ]
00:21:25.249 }'
00:21:25.249 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:21:25.249 00:03:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:21:25.813 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:21:26.070 [2024-05-15 00:03:26.471891] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:21:26.070 [2024-05-15 00:03:26.471924] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:21:26.070
00:21:26.070 Latency(us)
00:21:26.070 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:21:26.070 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:21:26.070 raid_bdev1 : 11.75 93.02 279.06 0.00 0.00 13905.98 297.41 110328.43
00:21:26.070 ===================================================================================================================
00:21:26.070 Total : 93.02 279.06 0.00 0.00 13905.98 297.41 110328.43
00:21:26.070 [2024-05-15 00:03:26.503906] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:21:26.070 [2024-05-15 00:03:26.503935] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:21:26.070 [2024-05-15 00:03:26.504008] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:21:26.070 [2024-05-15 00:03:26.504020] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1797f00 name raid_bdev1, state offline
00:21:26.070 0
00:21:26.070 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # jq length
00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]]
00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']'
00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']'
00:21:26.328
00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.328 00:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:26.586 /dev/nbd0 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:26.586 1+0 records in 00:21:26.586 1+0 records out 00:21:26.586 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280102 s, 14.6 MB/s 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for 
bdev in "${base_bdevs[@]:1}" 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev2 ']' 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.586 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:26.844 /dev/nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:26.844 1+0 records in 00:21:26.844 1+0 records out 00:21:26.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307416 s, 13.3 MB/s 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:26.844 00:03:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:26.844 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:27.101 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:27.665 00:03:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:21:27.665 00:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:27.665 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:27.923 [2024-05-15 00:03:28.429671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:27.923 [2024-05-15 00:03:28.429718] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.923 [2024-05-15 00:03:28.429742] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179ed80 00:21:27.923 [2024-05-15 00:03:28.429755] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.923 [2024-05-15 00:03:28.431345] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.923 [2024-05-15 00:03:28.431375] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:27.923 [2024-05-15 00:03:28.431451] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:27.923 [2024-05-15 00:03:28.431478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:27.923 BaseBdev1 00:21:27.923 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:27.923 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:21:27.923 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:21:28.180 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:28.437 [2024-05-15 00:03:28.919045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:28.437 [2024-05-15 00:03:28.919088] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.437 [2024-05-15 00:03:28.919113] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18348f0 00:21:28.437 [2024-05-15 00:03:28.919126] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.437 [2024-05-15 00:03:28.919457] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.437 [2024-05-15 00:03:28.919475] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:28.437 [2024-05-15 00:03:28.919536] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:21:28.437 [2024-05-15 00:03:28.919548] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing 
raid bdev raid_bdev1 (1) 00:21:28.437 [2024-05-15 00:03:28.919558] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:28.437 [2024-05-15 00:03:28.919574] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1799420 name raid_bdev1, state configuring 00:21:28.437 [2024-05-15 00:03:28.919602] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:28.437 BaseBdev2 00:21:28.437 00:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:28.694 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:28.951 [2024-05-15 00:03:29.408410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:28.951 [2024-05-15 00:03:29.408452] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.951 [2024-05-15 00:03:29.408474] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1944f40 00:21:28.951 [2024-05-15 00:03:29.408486] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.951 [2024-05-15 00:03:29.408842] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.951 [2024-05-15 00:03:29.408860] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:28.951 [2024-05-15 00:03:29.408936] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:21:28.951 [2024-05-15 00:03:29.408955] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:28.951 spare 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:28.951 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.952 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.952 [2024-05-15 00:03:29.509280] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x18344c0 00:21:28.952 [2024-05-15 00:03:29.509300] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 
63488, blocklen 512 00:21:28.952 [2024-05-15 00:03:29.509494] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1796a10 00:21:28.952 [2024-05-15 00:03:29.509651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18344c0 00:21:28.952 [2024-05-15 00:03:29.509661] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18344c0 00:21:28.952 [2024-05-15 00:03:29.509770] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.208 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:29.209 "name": "raid_bdev1", 00:21:29.209 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:29.209 "strip_size_kb": 0, 00:21:29.209 "state": "online", 00:21:29.209 "raid_level": "raid1", 00:21:29.209 "superblock": true, 00:21:29.209 "num_base_bdevs": 2, 00:21:29.209 "num_base_bdevs_discovered": 2, 00:21:29.209 "num_base_bdevs_operational": 2, 00:21:29.209 "base_bdevs_list": [ 00:21:29.209 { 00:21:29.209 "name": "spare", 00:21:29.209 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:29.209 "is_configured": true, 00:21:29.209 "data_offset": 2048, 00:21:29.209 "data_size": 63488 00:21:29.209 }, 00:21:29.209 { 00:21:29.209 "name": "BaseBdev2", 00:21:29.209 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:29.209 "is_configured": true, 00:21:29.209 "data_offset": 2048, 00:21:29.209 "data_size": 63488 00:21:29.209 } 00:21:29.209 ] 00:21:29.209 }' 00:21:29.209 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:29.209 00:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.776 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.033 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:30.033 "name": "raid_bdev1", 00:21:30.033 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:30.034 "strip_size_kb": 0, 00:21:30.034 "state": "online", 00:21:30.034 "raid_level": "raid1", 00:21:30.034 "superblock": true, 00:21:30.034 "num_base_bdevs": 2, 00:21:30.034 "num_base_bdevs_discovered": 2, 00:21:30.034 "num_base_bdevs_operational": 2, 00:21:30.034 "base_bdevs_list": [ 00:21:30.034 { 00:21:30.034 "name": "spare", 00:21:30.034 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:30.034 "is_configured": true, 00:21:30.034 "data_offset": 2048, 00:21:30.034 "data_size": 63488 00:21:30.034 }, 00:21:30.034 { 00:21:30.034 "name": "BaseBdev2", 00:21:30.034 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:30.034 "is_configured": true, 00:21:30.034 "data_offset": 2048, 00:21:30.034 "data_size": 63488 00:21:30.034 } 
00:21:30.034 ] 00:21:30.034 }' 00:21:30.034 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:30.034 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:30.034 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:30.034 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:30.291 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.291 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:30.291 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:21:30.292 00:03:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:30.550 [2024-05-15 00:03:31.089271] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.550 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.809 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:30.809 "name": "raid_bdev1", 00:21:30.809 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:30.809 "strip_size_kb": 0, 00:21:30.809 "state": "online", 00:21:30.809 "raid_level": "raid1", 00:21:30.809 "superblock": true, 00:21:30.809 "num_base_bdevs": 2, 00:21:30.809 "num_base_bdevs_discovered": 1, 00:21:30.809 "num_base_bdevs_operational": 1, 00:21:30.809 "base_bdevs_list": [ 00:21:30.809 { 00:21:30.809 "name": null, 00:21:30.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.809 "is_configured": false, 00:21:30.809 "data_offset": 2048, 00:21:30.809 "data_size": 63488 00:21:30.809 }, 00:21:30.809 { 00:21:30.809 "name": "BaseBdev2", 00:21:30.809 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:30.809 "is_configured": 
true, 00:21:30.809 "data_offset": 2048, 00:21:30.809 "data_size": 63488 00:21:30.809 } 00:21:30.809 ] 00:21:30.809 }' 00:21:30.809 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:30.809 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:31.373 00:03:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:31.631 [2024-05-15 00:03:32.120130] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:31.631 [2024-05-15 00:03:32.120269] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:31.631 [2024-05-15 00:03:32.120285] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:31.631 [2024-05-15 00:03:32.120312] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:31.631 [2024-05-15 00:03:32.125547] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179e490 00:21:31.631 [2024-05-15 00:03:32.127007] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:31.631 00:03:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # sleep 1 00:21:32.563 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:32.563 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:32.563 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:32.563 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:32.563 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:32.820 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.820 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.820 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:32.820 "name": "raid_bdev1", 00:21:32.820 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:32.820 "strip_size_kb": 0, 00:21:32.820 "state": "online", 00:21:32.820 "raid_level": "raid1", 00:21:32.820 "superblock": true, 00:21:32.820 "num_base_bdevs": 2, 00:21:32.820 "num_base_bdevs_discovered": 2, 00:21:32.820 "num_base_bdevs_operational": 2, 00:21:32.820 "process": { 00:21:32.820 "type": "rebuild", 00:21:32.820 "target": "spare", 00:21:32.820 "progress": { 00:21:32.820 "blocks": 24576, 00:21:32.820 "percent": 38 00:21:32.820 } 00:21:32.820 }, 00:21:32.820 "base_bdevs_list": [ 00:21:32.820 { 00:21:32.820 "name": "spare", 00:21:32.820 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:32.820 "is_configured": true, 00:21:32.820 "data_offset": 2048, 00:21:32.820 "data_size": 63488 00:21:32.820 }, 00:21:32.820 { 00:21:32.820 "name": "BaseBdev2", 00:21:32.820 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:32.820 "is_configured": true, 00:21:32.820 "data_offset": 2048, 00:21:32.820 "data_size": 63488 00:21:32.820 } 00:21:32.820 ] 00:21:32.820 }' 00:21:32.820 
00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:33.078 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:33.078 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:33.078 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:33.078 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:33.336 [2024-05-15 00:03:33.710291] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:33.336 [2024-05-15 00:03:33.739355] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:33.336 [2024-05-15 00:03:33.739405] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:33.336 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:33.336 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:33.336 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.337 00:03:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.595 00:03:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:33.595 "name": "raid_bdev1", 00:21:33.595 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:33.595 "strip_size_kb": 0, 00:21:33.595 "state": "online", 00:21:33.595 "raid_level": "raid1", 00:21:33.595 "superblock": true, 00:21:33.595 "num_base_bdevs": 2, 00:21:33.595 "num_base_bdevs_discovered": 1, 00:21:33.595 "num_base_bdevs_operational": 1, 00:21:33.595 "base_bdevs_list": [ 00:21:33.595 { 00:21:33.595 "name": null, 00:21:33.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.595 "is_configured": false, 00:21:33.595 "data_offset": 2048, 00:21:33.595 "data_size": 63488 00:21:33.595 }, 00:21:33.595 { 00:21:33.595 "name": "BaseBdev2", 00:21:33.595 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:33.595 "is_configured": true, 00:21:33.595 "data_offset": 2048, 00:21:33.595 "data_size": 63488 00:21:33.595 } 00:21:33.595 ] 00:21:33.595 }' 00:21:33.595 00:03:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:21:33.595 00:03:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:34.162 00:03:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:34.162 [2024-05-15 00:03:34.735414] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:34.162 [2024-05-15 00:03:34.735470] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.162 [2024-05-15 00:03:34.735494] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1946510 00:21:34.162 [2024-05-15 00:03:34.735507] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.162 [2024-05-15 00:03:34.735874] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.162 [2024-05-15 00:03:34.735891] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:34.162 [2024-05-15 00:03:34.735971] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:21:34.162 [2024-05-15 00:03:34.735984] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:34.162 [2024-05-15 00:03:34.735994] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:34.162 [2024-05-15 00:03:34.736014] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:34.162 [2024-05-15 00:03:34.741326] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179da90 00:21:34.162 spare 00:21:34.162 [2024-05-15 00:03:34.742668] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:34.421 00:03:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # sleep 1 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.357 00:03:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:35.616 "name": "raid_bdev1", 00:21:35.616 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:35.616 "strip_size_kb": 0, 00:21:35.616 "state": "online", 00:21:35.616 "raid_level": "raid1", 00:21:35.616 "superblock": true, 00:21:35.616 "num_base_bdevs": 2, 00:21:35.616 "num_base_bdevs_discovered": 2, 00:21:35.616 "num_base_bdevs_operational": 2, 00:21:35.616 "process": { 00:21:35.616 "type": "rebuild", 00:21:35.616 "target": "spare", 00:21:35.616 "progress": { 00:21:35.616 "blocks": 24576, 00:21:35.616 "percent": 38 
00:21:35.616 } 00:21:35.616 }, 00:21:35.616 "base_bdevs_list": [ 00:21:35.616 { 00:21:35.616 "name": "spare", 00:21:35.616 "uuid": "afe6ad3a-b46e-5efc-ad5c-0f7fa168b6e1", 00:21:35.616 "is_configured": true, 00:21:35.616 "data_offset": 2048, 00:21:35.616 "data_size": 63488 00:21:35.616 }, 00:21:35.616 { 00:21:35.616 "name": "BaseBdev2", 00:21:35.616 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:35.616 "is_configured": true, 00:21:35.616 "data_offset": 2048, 00:21:35.616 "data_size": 63488 00:21:35.616 } 00:21:35.616 ] 00:21:35.616 }' 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:35.616 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:35.874 [2024-05-15 00:03:36.326901] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:35.874 [2024-05-15 00:03:36.355347] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:35.874 [2024-05-15 00:03:36.355391] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.874 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.133 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:36.133 "name": "raid_bdev1", 00:21:36.133 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:36.133 "strip_size_kb": 0, 00:21:36.133 "state": "online", 00:21:36.133 "raid_level": "raid1", 00:21:36.133 "superblock": true, 00:21:36.133 "num_base_bdevs": 2, 00:21:36.133 "num_base_bdevs_discovered": 1, 00:21:36.133 "num_base_bdevs_operational": 1, 00:21:36.133 "base_bdevs_list": [ 
00:21:36.133 { 00:21:36.133 "name": null, 00:21:36.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.133 "is_configured": false, 00:21:36.133 "data_offset": 2048, 00:21:36.133 "data_size": 63488 00:21:36.133 }, 00:21:36.133 { 00:21:36.133 "name": "BaseBdev2", 00:21:36.133 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:36.133 "is_configured": true, 00:21:36.133 "data_offset": 2048, 00:21:36.133 "data_size": 63488 00:21:36.133 } 00:21:36.133 ] 00:21:36.133 }' 00:21:36.133 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:36.133 00:03:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.701 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.959 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:36.959 "name": "raid_bdev1", 00:21:36.959 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:36.959 "strip_size_kb": 0, 00:21:36.959 "state": "online", 00:21:36.959 "raid_level": "raid1", 00:21:36.959 "superblock": true, 00:21:36.959 "num_base_bdevs": 2, 00:21:36.959 "num_base_bdevs_discovered": 1, 00:21:36.959 "num_base_bdevs_operational": 1, 00:21:36.959 "base_bdevs_list": [ 00:21:36.959 { 00:21:36.959 "name": null, 00:21:36.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.959 "is_configured": false, 00:21:36.959 "data_offset": 2048, 00:21:36.959 "data_size": 63488 00:21:36.959 }, 00:21:36.959 { 00:21:36.959 "name": "BaseBdev2", 00:21:36.959 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:36.959 "is_configured": true, 00:21:36.959 "data_offset": 2048, 00:21:36.959 "data_size": 63488 00:21:36.959 } 00:21:36.959 ] 00:21:36.959 }' 00:21:36.959 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:36.959 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:36.959 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:37.248 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:37.248 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:37.248 00:03:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:37.506 [2024-05-15 00:03:38.032761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match 
on BaseBdev1_malloc 00:21:37.506 [2024-05-15 00:03:38.032810] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:37.506 [2024-05-15 00:03:38.032832] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17967d0 00:21:37.506 [2024-05-15 00:03:38.032845] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:37.506 [2024-05-15 00:03:38.033185] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:37.506 [2024-05-15 00:03:38.033203] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:37.506 [2024-05-15 00:03:38.033266] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:37.506 [2024-05-15 00:03:38.033277] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:37.506 [2024-05-15 00:03:38.033287] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:37.506 BaseBdev1 00:21:37.506 00:03:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@786 -- # sleep 1 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.880 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:38.880 "name": "raid_bdev1", 00:21:38.880 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:38.880 "strip_size_kb": 0, 00:21:38.880 "state": "online", 00:21:38.880 "raid_level": "raid1", 00:21:38.880 "superblock": true, 00:21:38.880 "num_base_bdevs": 2, 00:21:38.880 "num_base_bdevs_discovered": 1, 00:21:38.880 "num_base_bdevs_operational": 1, 00:21:38.880 "base_bdevs_list": [ 00:21:38.880 { 00:21:38.880 "name": null, 00:21:38.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.880 "is_configured": false, 00:21:38.880 "data_offset": 2048, 00:21:38.880 "data_size": 63488 00:21:38.880 }, 00:21:38.880 { 00:21:38.880 "name": "BaseBdev2", 00:21:38.880 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:38.881 "is_configured": true, 00:21:38.881 
"data_offset": 2048, 00:21:38.881 "data_size": 63488 00:21:38.881 } 00:21:38.881 ] 00:21:38.881 }' 00:21:38.881 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:38.881 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.445 00:03:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.701 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:39.701 "name": "raid_bdev1", 00:21:39.701 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:39.701 "strip_size_kb": 0, 00:21:39.701 "state": "online", 00:21:39.701 "raid_level": "raid1", 00:21:39.701 "superblock": true, 00:21:39.701 "num_base_bdevs": 2, 00:21:39.701 "num_base_bdevs_discovered": 1, 00:21:39.701 "num_base_bdevs_operational": 1, 00:21:39.701 "base_bdevs_list": [ 00:21:39.701 { 00:21:39.701 "name": null, 00:21:39.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.701 "is_configured": false, 00:21:39.701 "data_offset": 2048, 00:21:39.701 "data_size": 63488 00:21:39.701 }, 00:21:39.701 { 00:21:39.701 "name": "BaseBdev2", 00:21:39.701 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:39.701 "is_configured": true, 00:21:39.702 "data_offset": 2048, 00:21:39.702 "data_size": 63488 00:21:39.702 } 00:21:39.702 ] 00:21:39.702 }' 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:39.702 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:39.959 [2024-05-15 00:03:40.375289] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:39.959 [2024-05-15 00:03:40.375418] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:39.959 [2024-05-15 00:03:40.375434] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:39.959 request: 00:21:39.959 { 00:21:39.959 "raid_bdev": "raid_bdev1", 00:21:39.959 "base_bdev": "BaseBdev1", 00:21:39.959 "method": "bdev_raid_add_base_bdev", 00:21:39.959 "req_id": 1 00:21:39.959 } 00:21:39.959 Got JSON-RPC error response 00:21:39.959 response: 00:21:39.959 { 00:21:39.959 "code": -22, 00:21:39.959 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:39.959 } 00:21:39.959 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:39.959 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:39.959 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:39.959 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:39.959 00:03:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # sleep 1 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.892 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.150 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:41.150 "name": "raid_bdev1", 00:21:41.150 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:41.150 "strip_size_kb": 0, 00:21:41.150 "state": "online", 00:21:41.150 "raid_level": "raid1", 00:21:41.150 "superblock": true, 00:21:41.150 "num_base_bdevs": 2, 00:21:41.150 "num_base_bdevs_discovered": 1, 00:21:41.150 "num_base_bdevs_operational": 1, 00:21:41.150 "base_bdevs_list": [ 00:21:41.150 { 00:21:41.150 "name": null, 00:21:41.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.150 "is_configured": false, 00:21:41.150 "data_offset": 2048, 00:21:41.150 "data_size": 63488 00:21:41.150 }, 00:21:41.150 { 00:21:41.150 "name": "BaseBdev2", 00:21:41.150 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:41.150 "is_configured": true, 00:21:41.150 "data_offset": 2048, 00:21:41.150 "data_size": 63488 00:21:41.150 } 00:21:41.150 ] 00:21:41.150 }' 00:21:41.150 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:41.150 00:03:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.715 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.973 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:41.973 "name": "raid_bdev1", 00:21:41.973 "uuid": "d41634da-db79-4230-b900-0bec5b1900d0", 00:21:41.973 "strip_size_kb": 0, 00:21:41.973 "state": "online", 00:21:41.973 "raid_level": "raid1", 00:21:41.973 "superblock": true, 00:21:41.973 "num_base_bdevs": 2, 00:21:41.973 "num_base_bdevs_discovered": 1, 00:21:41.973 "num_base_bdevs_operational": 1, 00:21:41.973 "base_bdevs_list": [ 00:21:41.973 { 00:21:41.973 "name": null, 00:21:41.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.973 "is_configured": false, 00:21:41.973 "data_offset": 2048, 00:21:41.973 "data_size": 63488 00:21:41.973 }, 00:21:41.973 { 00:21:41.973 "name": "BaseBdev2", 00:21:41.973 "uuid": "e0f444fe-1fcb-597c-a425-d595209b03e4", 00:21:41.973 "is_configured": true, 00:21:41.973 "data_offset": 2048, 00:21:41.973 "data_size": 63488 00:21:41.973 } 00:21:41.973 ] 00:21:41.973 }' 00:21:41.974 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:41.974 00:03:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:41.974 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # killprocess 483793 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 483793 ']' 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 483793 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 483793 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 483793' 00:21:42.232 killing process with pid 483793 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 483793 00:21:42.232 Received shutdown signal, test time was about 27.857257 seconds 00:21:42.232 00:21:42.232 Latency(us) 00:21:42.232 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:42.232 =================================================================================================================== 00:21:42.232 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:42.232 [2024-05-15 00:03:42.646382] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:42.232 [2024-05-15 00:03:42.646497] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:42.232 [2024-05-15 00:03:42.646550] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:42.232 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 483793 00:21:42.232 [2024-05-15 00:03:42.646562] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18344c0 name raid_bdev1, state offline 00:21:42.232 [2024-05-15 00:03:42.671229] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:42.489 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@797 -- # return 0 00:21:42.489 00:21:42.489 real 0m32.287s 00:21:42.489 user 0m50.541s 00:21:42.489 sys 0m4.660s 00:21:42.489 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:42.489 00:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:42.489 ************************************ 00:21:42.489 END TEST raid_rebuild_test_sb_io 00:21:42.489 ************************************ 00:21:42.489 00:03:42 bdev_raid -- bdev/bdev_raid.sh@822 -- # for n in 2 4 00:21:42.489 00:03:42 bdev_raid -- bdev/bdev_raid.sh@823 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:21:42.489 00:03:42 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:21:42.489 00:03:42 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 
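The raid_rebuild_test_sb_io run above keeps repeating a single status check: bdev_raid_get_bdevs is queried over the test's RPC socket and the JSON is filtered with jq to confirm the array state and whether a rebuild process is active. The bash sketch below only condenses that check for readability; check_raid is a hypothetical name and not the suite's verify_raid_bdev_state/verify_raid_bdev_process helpers, though the rpc.py path, socket, and jq filters are copied from the trace.

  #!/usr/bin/env bash
  # Illustrative sketch of the status check traced above (not part of the log).
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  check_raid() {                      # hypothetical helper name
      local name=$1 want_state=$2 want_process=$3
      local info
      # Same query/filter pair the trace runs via bdev_raid.sh@127/@188.
      info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
      [[ $(jq -r '.state' <<< "$info") == "$want_state" ]] || return 1
      [[ $(jq -r '.process.type // "none"' <<< "$info") == "$want_process" ]] || return 1
  }

  # e.g. check_raid raid_bdev1 online none      (no rebuild running)
  #      check_raid raid_bdev1 online rebuild   (rebuild in progress)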
00:21:42.489 00:03:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:42.489 ************************************ 00:21:42.489 START TEST raid_rebuild_test 00:21:42.489 ************************************ 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false false true 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local verify=true 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:42.489 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # local strip_size 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@582 -- # local create_arg 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local data_offset 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # raid_pid=488378 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # waitforlisten 488378 /var/tmp/spdk-raid.sock 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 488378 ']' 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:42.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:42.490 00:03:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.747 [2024-05-15 00:03:43.080112] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:21:42.747 [2024-05-15 00:03:43.080176] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488378 ] 00:21:42.747 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:42.747 Zero copy mechanism will not be used. 00:21:42.747 [2024-05-15 00:03:43.208731] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:42.747 [2024-05-15 00:03:43.317999] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:43.004 [2024-05-15 00:03:43.380470] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.004 [2024-05-15 00:03:43.380500] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.569 00:03:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:43.569 00:03:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:21:43.569 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:43.569 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:43.827 BaseBdev1_malloc 00:21:43.827 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:44.084 [2024-05-15 00:03:44.468149] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:44.084 [2024-05-15 00:03:44.468195] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:44.084 [2024-05-15 00:03:44.468217] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21aab50 00:21:44.084 [2024-05-15 00:03:44.468230] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:44.084 [2024-05-15 00:03:44.469949] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:44.084 [2024-05-15 00:03:44.469980] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:44.084 BaseBdev1 00:21:44.085 00:03:44 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:44.085 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:44.342 BaseBdev2_malloc 00:21:44.342 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:44.600 [2024-05-15 00:03:44.955453] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:44.600 [2024-05-15 00:03:44.955495] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:44.600 [2024-05-15 00:03:44.955514] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2350d10 00:21:44.600 [2024-05-15 00:03:44.955526] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:44.600 [2024-05-15 00:03:44.957030] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:44.600 [2024-05-15 00:03:44.957058] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:44.600 BaseBdev2 00:21:44.600 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:44.600 00:03:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:44.857 BaseBdev3_malloc 00:21:44.857 00:03:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:45.114 [2024-05-15 00:03:45.462613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:45.114 [2024-05-15 00:03:45.462660] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:45.114 [2024-05-15 00:03:45.462680] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2352510 00:21:45.114 [2024-05-15 00:03:45.462693] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:45.114 [2024-05-15 00:03:45.464294] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:45.114 [2024-05-15 00:03:45.464323] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:45.114 BaseBdev3 00:21:45.114 00:03:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:45.114 00:03:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:45.372 BaseBdev4_malloc 00:21:45.372 00:03:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:45.629 [2024-05-15 00:03:45.968613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:45.629 [2024-05-15 00:03:45.968659] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:45.629 [2024-05-15 00:03:45.968679] vbdev_passthru.c: 
676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235c290 00:21:45.629 [2024-05-15 00:03:45.968692] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:45.629 [2024-05-15 00:03:45.970133] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:45.629 [2024-05-15 00:03:45.970161] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:45.629 BaseBdev4 00:21:45.629 00:03:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:45.886 spare_malloc 00:21:45.887 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:45.887 spare_delay 00:21:46.143 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:46.143 [2024-05-15 00:03:46.700354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:46.143 [2024-05-15 00:03:46.700411] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.143 [2024-05-15 00:03:46.700434] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a1e00 00:21:46.143 [2024-05-15 00:03:46.700447] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.143 [2024-05-15 00:03:46.702070] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.143 [2024-05-15 00:03:46.702099] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:46.143 spare 00:21:46.143 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:46.400 [2024-05-15 00:03:46.945020] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:46.400 [2024-05-15 00:03:46.946379] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:46.400 [2024-05-15 00:03:46.946443] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:46.400 [2024-05-15 00:03:46.946487] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:46.400 [2024-05-15 00:03:46.946566] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21a5840 00:21:46.400 [2024-05-15 00:03:46.946575] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:46.400 [2024-05-15 00:03:46.946793] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a91f0 00:21:46.400 [2024-05-15 00:03:46.946950] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21a5840 00:21:46.400 [2024-05-15 00:03:46.946959] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21a5840 00:21:46.400 [2024-05-15 00:03:46.947079] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 4 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.400 00:03:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:46.658 00:03:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:46.658 "name": "raid_bdev1", 00:21:46.658 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:21:46.658 "strip_size_kb": 0, 00:21:46.658 "state": "online", 00:21:46.658 "raid_level": "raid1", 00:21:46.658 "superblock": false, 00:21:46.658 "num_base_bdevs": 4, 00:21:46.658 "num_base_bdevs_discovered": 4, 00:21:46.658 "num_base_bdevs_operational": 4, 00:21:46.658 "base_bdevs_list": [ 00:21:46.658 { 00:21:46.658 "name": "BaseBdev1", 00:21:46.658 "uuid": "77c68871-1833-5c93-8146-faf290f569a1", 00:21:46.658 "is_configured": true, 00:21:46.658 "data_offset": 0, 00:21:46.658 "data_size": 65536 00:21:46.658 }, 00:21:46.658 { 00:21:46.658 "name": "BaseBdev2", 00:21:46.658 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:21:46.658 "is_configured": true, 00:21:46.658 "data_offset": 0, 00:21:46.658 "data_size": 65536 00:21:46.658 }, 00:21:46.658 { 00:21:46.658 "name": "BaseBdev3", 00:21:46.658 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:21:46.658 "is_configured": true, 00:21:46.658 "data_offset": 0, 00:21:46.658 "data_size": 65536 00:21:46.658 }, 00:21:46.658 { 00:21:46.658 "name": "BaseBdev4", 00:21:46.658 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:21:46.658 "is_configured": true, 00:21:46.658 "data_offset": 0, 00:21:46.658 "data_size": 65536 00:21:46.658 } 00:21:46.658 ] 00:21:46.658 }' 00:21:46.658 00:03:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:46.658 00:03:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:47.221 00:03:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:47.221 00:03:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:21:47.479 [2024-05-15 00:03:48.008065] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:47.479 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:21:47.479 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.479 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:47.736 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:47.993 [2024-05-15 00:03:48.501143] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a4700 00:21:47.993 /dev/nbd0 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:47.993 1+0 records in 00:21:47.993 1+0 records out 00:21:47.993 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280741 s, 14.6 MB/s 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:21:47.993 00:03:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:56.089 65536+0 records in 00:21:56.089 65536+0 records out 00:21:56.089 33554432 bytes (34 MB, 32 MiB) copied, 6.60129 s, 5.1 MB/s 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:56.089 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:56.090 [2024-05-15 00:03:55.365991] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:56.090 [2024-05-15 00:03:55.594644] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:56.090 00:03:55 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:56.090 "name": "raid_bdev1", 00:21:56.090 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:21:56.090 "strip_size_kb": 0, 00:21:56.090 "state": "online", 00:21:56.090 "raid_level": "raid1", 00:21:56.090 "superblock": false, 00:21:56.090 "num_base_bdevs": 4, 00:21:56.090 "num_base_bdevs_discovered": 3, 00:21:56.090 "num_base_bdevs_operational": 3, 00:21:56.090 "base_bdevs_list": [ 00:21:56.090 { 00:21:56.090 "name": null, 00:21:56.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.090 "is_configured": false, 00:21:56.090 "data_offset": 0, 00:21:56.090 "data_size": 65536 00:21:56.090 }, 00:21:56.090 { 00:21:56.090 "name": "BaseBdev2", 00:21:56.090 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:21:56.090 "is_configured": true, 00:21:56.090 "data_offset": 0, 00:21:56.090 "data_size": 65536 00:21:56.090 }, 00:21:56.090 { 00:21:56.090 "name": "BaseBdev3", 00:21:56.090 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:21:56.090 "is_configured": true, 00:21:56.090 "data_offset": 0, 00:21:56.090 "data_size": 65536 00:21:56.090 }, 00:21:56.090 { 00:21:56.090 "name": "BaseBdev4", 00:21:56.090 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:21:56.090 "is_configured": true, 00:21:56.090 "data_offset": 0, 00:21:56.090 "data_size": 65536 00:21:56.090 } 00:21:56.090 ] 00:21:56.090 }' 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:56.090 00:03:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.090 00:03:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:56.090 [2024-05-15 00:03:56.609336] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:56.090 [2024-05-15 00:03:56.613389] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eb2cd0 00:21:56.090 [2024-05-15 00:03:56.615636] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:56.090 00:03:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # sleep 1 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 
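The verify_raid_bdev_state step traced here queries the RAID bdev over the test's dedicated RPC socket and filters the JSON with jq before comparing a handful of fields. Below is a minimal bash sketch of that check, reconstructed only from the rpc.py and jq commands visible in the trace; the strict exit-on-mismatch handling is an assumption, and the real helper checks more fields than shown.

    # Sketch of the state check above, reconstructed from the traced commands; the
    # comparison/exit handling is simplified relative to the real verify_raid_bdev_state.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Dump all RAID bdevs and keep only the one under test.
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Assert the fields this step cares about: still online, raid1, 3 of 4 base bdevs left.
    [ "$(echo "$info" | jq -r .state)" = online ] || exit 1
    [ "$(echo "$info" | jq -r .raid_level)" = raid1 ] || exit 1
    [ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" -eq 3 ] || exit 1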
00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:57.486 "name": "raid_bdev1", 00:21:57.486 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:21:57.486 "strip_size_kb": 0, 00:21:57.486 "state": "online", 00:21:57.486 "raid_level": "raid1", 00:21:57.486 "superblock": false, 00:21:57.486 "num_base_bdevs": 4, 00:21:57.486 "num_base_bdevs_discovered": 4, 00:21:57.486 "num_base_bdevs_operational": 4, 00:21:57.486 "process": { 00:21:57.486 "type": "rebuild", 00:21:57.486 "target": "spare", 00:21:57.486 "progress": { 00:21:57.486 "blocks": 24576, 00:21:57.486 "percent": 37 00:21:57.486 } 00:21:57.486 }, 00:21:57.486 "base_bdevs_list": [ 00:21:57.486 { 00:21:57.486 "name": "spare", 00:21:57.486 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:21:57.486 "is_configured": true, 00:21:57.486 "data_offset": 0, 00:21:57.486 "data_size": 65536 00:21:57.486 }, 00:21:57.486 { 00:21:57.486 "name": "BaseBdev2", 00:21:57.486 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:21:57.486 "is_configured": true, 00:21:57.486 "data_offset": 0, 00:21:57.486 "data_size": 65536 00:21:57.486 }, 00:21:57.486 { 00:21:57.486 "name": "BaseBdev3", 00:21:57.486 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:21:57.486 "is_configured": true, 00:21:57.486 "data_offset": 0, 00:21:57.486 "data_size": 65536 00:21:57.486 }, 00:21:57.486 { 00:21:57.486 "name": "BaseBdev4", 00:21:57.486 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:21:57.486 "is_configured": true, 00:21:57.486 "data_offset": 0, 00:21:57.486 "data_size": 65536 00:21:57.486 } 00:21:57.486 ] 00:21:57.486 }' 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:57.486 00:03:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:57.743 [2024-05-15 00:03:58.178965] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:57.743 [2024-05-15 00:03:58.228002] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:57.743 [2024-05-15 00:03:58.228046] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:57.744 00:03:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.744 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.000 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:58.000 "name": "raid_bdev1", 00:21:58.000 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:21:58.000 "strip_size_kb": 0, 00:21:58.000 "state": "online", 00:21:58.000 "raid_level": "raid1", 00:21:58.000 "superblock": false, 00:21:58.000 "num_base_bdevs": 4, 00:21:58.000 "num_base_bdevs_discovered": 3, 00:21:58.000 "num_base_bdevs_operational": 3, 00:21:58.000 "base_bdevs_list": [ 00:21:58.000 { 00:21:58.000 "name": null, 00:21:58.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.000 "is_configured": false, 00:21:58.000 "data_offset": 0, 00:21:58.000 "data_size": 65536 00:21:58.000 }, 00:21:58.000 { 00:21:58.000 "name": "BaseBdev2", 00:21:58.000 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:21:58.000 "is_configured": true, 00:21:58.000 "data_offset": 0, 00:21:58.000 "data_size": 65536 00:21:58.000 }, 00:21:58.000 { 00:21:58.000 "name": "BaseBdev3", 00:21:58.000 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:21:58.000 "is_configured": true, 00:21:58.000 "data_offset": 0, 00:21:58.000 "data_size": 65536 00:21:58.000 }, 00:21:58.000 { 00:21:58.000 "name": "BaseBdev4", 00:21:58.000 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:21:58.000 "is_configured": true, 00:21:58.000 "data_offset": 0, 00:21:58.000 "data_size": 65536 00:21:58.000 } 00:21:58.000 ] 00:21:58.000 }' 00:21:58.000 00:03:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:58.000 00:03:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.565 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.822 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:58.822 "name": "raid_bdev1", 00:21:58.822 "uuid": 
"93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:21:58.822 "strip_size_kb": 0, 00:21:58.822 "state": "online", 00:21:58.822 "raid_level": "raid1", 00:21:58.822 "superblock": false, 00:21:58.822 "num_base_bdevs": 4, 00:21:58.822 "num_base_bdevs_discovered": 3, 00:21:58.822 "num_base_bdevs_operational": 3, 00:21:58.822 "base_bdevs_list": [ 00:21:58.822 { 00:21:58.822 "name": null, 00:21:58.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.822 "is_configured": false, 00:21:58.822 "data_offset": 0, 00:21:58.822 "data_size": 65536 00:21:58.822 }, 00:21:58.822 { 00:21:58.822 "name": "BaseBdev2", 00:21:58.822 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:21:58.822 "is_configured": true, 00:21:58.822 "data_offset": 0, 00:21:58.822 "data_size": 65536 00:21:58.822 }, 00:21:58.822 { 00:21:58.822 "name": "BaseBdev3", 00:21:58.822 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:21:58.822 "is_configured": true, 00:21:58.822 "data_offset": 0, 00:21:58.822 "data_size": 65536 00:21:58.822 }, 00:21:58.822 { 00:21:58.822 "name": "BaseBdev4", 00:21:58.822 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:21:58.822 "is_configured": true, 00:21:58.822 "data_offset": 0, 00:21:58.822 "data_size": 65536 00:21:58.822 } 00:21:58.822 ] 00:21:58.822 }' 00:21:58.822 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:58.822 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:58.822 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:59.078 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:59.078 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:59.078 [2024-05-15 00:03:59.647877] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:59.078 [2024-05-15 00:03:59.651959] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a4700 00:21:59.078 [2024-05-15 00:03:59.653552] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:59.336 00:03:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # sleep 1 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.264 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.522 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:00.522 "name": "raid_bdev1", 00:22:00.522 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:00.522 "strip_size_kb": 0, 00:22:00.522 "state": "online", 00:22:00.522 "raid_level": "raid1", 00:22:00.522 
"superblock": false, 00:22:00.522 "num_base_bdevs": 4, 00:22:00.522 "num_base_bdevs_discovered": 4, 00:22:00.522 "num_base_bdevs_operational": 4, 00:22:00.522 "process": { 00:22:00.522 "type": "rebuild", 00:22:00.522 "target": "spare", 00:22:00.522 "progress": { 00:22:00.522 "blocks": 24576, 00:22:00.522 "percent": 37 00:22:00.522 } 00:22:00.522 }, 00:22:00.522 "base_bdevs_list": [ 00:22:00.522 { 00:22:00.522 "name": "spare", 00:22:00.522 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:00.522 "is_configured": true, 00:22:00.522 "data_offset": 0, 00:22:00.522 "data_size": 65536 00:22:00.522 }, 00:22:00.522 { 00:22:00.522 "name": "BaseBdev2", 00:22:00.522 "uuid": "6a585ba6-6499-542c-ac0a-0d53c8521209", 00:22:00.522 "is_configured": true, 00:22:00.522 "data_offset": 0, 00:22:00.522 "data_size": 65536 00:22:00.522 }, 00:22:00.522 { 00:22:00.522 "name": "BaseBdev3", 00:22:00.522 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:00.522 "is_configured": true, 00:22:00.522 "data_offset": 0, 00:22:00.522 "data_size": 65536 00:22:00.522 }, 00:22:00.522 { 00:22:00.522 "name": "BaseBdev4", 00:22:00.522 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:00.522 "is_configured": true, 00:22:00.522 "data_offset": 0, 00:22:00.522 "data_size": 65536 00:22:00.522 } 00:22:00.522 ] 00:22:00.522 }' 00:22:00.522 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:00.522 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:00.522 00:04:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:22:00.522 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:00.779 [2024-05-15 00:04:01.245234] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:00.779 [2024-05-15 00:04:01.266129] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x21a4700 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:00.779 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.036 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:01.036 "name": "raid_bdev1", 00:22:01.036 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:01.036 "strip_size_kb": 0, 00:22:01.036 "state": "online", 00:22:01.036 "raid_level": "raid1", 00:22:01.036 "superblock": false, 00:22:01.036 "num_base_bdevs": 4, 00:22:01.036 "num_base_bdevs_discovered": 3, 00:22:01.036 "num_base_bdevs_operational": 3, 00:22:01.036 "process": { 00:22:01.036 "type": "rebuild", 00:22:01.036 "target": "spare", 00:22:01.036 "progress": { 00:22:01.036 "blocks": 36864, 00:22:01.036 "percent": 56 00:22:01.036 } 00:22:01.036 }, 00:22:01.036 "base_bdevs_list": [ 00:22:01.036 { 00:22:01.036 "name": "spare", 00:22:01.036 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:01.036 "is_configured": true, 00:22:01.036 "data_offset": 0, 00:22:01.036 "data_size": 65536 00:22:01.036 }, 00:22:01.036 { 00:22:01.036 "name": null, 00:22:01.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.036 "is_configured": false, 00:22:01.036 "data_offset": 0, 00:22:01.036 "data_size": 65536 00:22:01.036 }, 00:22:01.036 { 00:22:01.036 "name": "BaseBdev3", 00:22:01.036 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:01.036 "is_configured": true, 00:22:01.036 "data_offset": 0, 00:22:01.036 "data_size": 65536 00:22:01.036 }, 00:22:01.036 { 00:22:01.036 "name": "BaseBdev4", 00:22:01.036 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:01.036 "is_configured": true, 00:22:01.036 "data_offset": 0, 00:22:01.036 "data_size": 65536 00:22:01.036 } 00:22:01.036 ] 00:22:01.036 }' 00:22:01.036 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:01.036 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:01.036 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # local timeout=732 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:01.296 "name": "raid_bdev1", 00:22:01.296 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:01.296 "strip_size_kb": 0, 00:22:01.296 "state": "online", 00:22:01.296 
"raid_level": "raid1", 00:22:01.296 "superblock": false, 00:22:01.296 "num_base_bdevs": 4, 00:22:01.296 "num_base_bdevs_discovered": 3, 00:22:01.296 "num_base_bdevs_operational": 3, 00:22:01.296 "process": { 00:22:01.296 "type": "rebuild", 00:22:01.296 "target": "spare", 00:22:01.296 "progress": { 00:22:01.296 "blocks": 43008, 00:22:01.296 "percent": 65 00:22:01.296 } 00:22:01.296 }, 00:22:01.296 "base_bdevs_list": [ 00:22:01.296 { 00:22:01.296 "name": "spare", 00:22:01.296 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:01.296 "is_configured": true, 00:22:01.296 "data_offset": 0, 00:22:01.296 "data_size": 65536 00:22:01.296 }, 00:22:01.296 { 00:22:01.296 "name": null, 00:22:01.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.296 "is_configured": false, 00:22:01.296 "data_offset": 0, 00:22:01.296 "data_size": 65536 00:22:01.296 }, 00:22:01.296 { 00:22:01.296 "name": "BaseBdev3", 00:22:01.296 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:01.296 "is_configured": true, 00:22:01.296 "data_offset": 0, 00:22:01.296 "data_size": 65536 00:22:01.296 }, 00:22:01.296 { 00:22:01.296 "name": "BaseBdev4", 00:22:01.296 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:01.296 "is_configured": true, 00:22:01.296 "data_offset": 0, 00:22:01.296 "data_size": 65536 00:22:01.296 } 00:22:01.296 ] 00:22:01.296 }' 00:22:01.296 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:01.554 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:01.554 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:01.554 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:01.554 00:04:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:02.484 [2024-05-15 00:04:02.878403] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:02.484 [2024-05-15 00:04:02.878468] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:02.484 [2024-05-15 00:04:02.878507] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.484 00:04:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:02.741 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:02.741 "name": "raid_bdev1", 00:22:02.741 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:02.741 "strip_size_kb": 0, 00:22:02.741 "state": "online", 00:22:02.741 "raid_level": "raid1", 00:22:02.741 "superblock": false, 
00:22:02.741 "num_base_bdevs": 4, 00:22:02.741 "num_base_bdevs_discovered": 3, 00:22:02.741 "num_base_bdevs_operational": 3, 00:22:02.741 "base_bdevs_list": [ 00:22:02.741 { 00:22:02.741 "name": "spare", 00:22:02.741 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:02.741 "is_configured": true, 00:22:02.741 "data_offset": 0, 00:22:02.741 "data_size": 65536 00:22:02.741 }, 00:22:02.741 { 00:22:02.741 "name": null, 00:22:02.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.741 "is_configured": false, 00:22:02.741 "data_offset": 0, 00:22:02.741 "data_size": 65536 00:22:02.741 }, 00:22:02.741 { 00:22:02.741 "name": "BaseBdev3", 00:22:02.741 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:02.741 "is_configured": true, 00:22:02.741 "data_offset": 0, 00:22:02.741 "data_size": 65536 00:22:02.741 }, 00:22:02.741 { 00:22:02.741 "name": "BaseBdev4", 00:22:02.741 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:02.741 "is_configured": true, 00:22:02.742 "data_offset": 0, 00:22:02.742 "data_size": 65536 00:22:02.742 } 00:22:02.742 ] 00:22:02.742 }' 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # break 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.742 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:02.999 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:02.999 "name": "raid_bdev1", 00:22:02.999 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:02.999 "strip_size_kb": 0, 00:22:02.999 "state": "online", 00:22:02.999 "raid_level": "raid1", 00:22:02.999 "superblock": false, 00:22:02.999 "num_base_bdevs": 4, 00:22:02.999 "num_base_bdevs_discovered": 3, 00:22:02.999 "num_base_bdevs_operational": 3, 00:22:02.999 "base_bdevs_list": [ 00:22:02.999 { 00:22:02.999 "name": "spare", 00:22:02.999 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:02.999 "is_configured": true, 00:22:02.999 "data_offset": 0, 00:22:02.999 "data_size": 65536 00:22:02.999 }, 00:22:02.999 { 00:22:02.999 "name": null, 00:22:02.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.999 "is_configured": false, 00:22:02.999 "data_offset": 0, 00:22:02.999 "data_size": 65536 00:22:02.999 }, 00:22:02.999 { 00:22:02.999 "name": "BaseBdev3", 00:22:02.999 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:02.999 "is_configured": true, 00:22:02.999 
"data_offset": 0, 00:22:02.999 "data_size": 65536 00:22:02.999 }, 00:22:02.999 { 00:22:02.999 "name": "BaseBdev4", 00:22:02.999 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:02.999 "is_configured": true, 00:22:02.999 "data_offset": 0, 00:22:02.999 "data_size": 65536 00:22:02.999 } 00:22:02.999 ] 00:22:02.999 }' 00:22:02.999 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:02.999 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:02.999 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.255 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.512 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:03.512 "name": "raid_bdev1", 00:22:03.512 "uuid": "93f82d16-e7d7-43c6-a7c1-310365a9c870", 00:22:03.512 "strip_size_kb": 0, 00:22:03.512 "state": "online", 00:22:03.512 "raid_level": "raid1", 00:22:03.512 "superblock": false, 00:22:03.512 "num_base_bdevs": 4, 00:22:03.512 "num_base_bdevs_discovered": 3, 00:22:03.512 "num_base_bdevs_operational": 3, 00:22:03.512 "base_bdevs_list": [ 00:22:03.512 { 00:22:03.512 "name": "spare", 00:22:03.512 "uuid": "f81ac732-77e0-5450-81d7-4fde39eee789", 00:22:03.512 "is_configured": true, 00:22:03.512 "data_offset": 0, 00:22:03.512 "data_size": 65536 00:22:03.512 }, 00:22:03.512 { 00:22:03.512 "name": null, 00:22:03.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.512 "is_configured": false, 00:22:03.512 "data_offset": 0, 00:22:03.512 "data_size": 65536 00:22:03.512 }, 00:22:03.512 { 00:22:03.512 "name": "BaseBdev3", 00:22:03.512 "uuid": "5c654d3a-9ff9-5276-91b2-2380a5ada108", 00:22:03.512 "is_configured": true, 00:22:03.512 "data_offset": 0, 00:22:03.512 "data_size": 65536 00:22:03.512 }, 00:22:03.512 { 00:22:03.512 "name": "BaseBdev4", 00:22:03.512 "uuid": "e7d53134-263e-51ab-832f-8b66249a8aaa", 00:22:03.512 "is_configured": true, 00:22:03.512 "data_offset": 0, 00:22:03.512 "data_size": 65536 00:22:03.512 } 00:22:03.512 ] 00:22:03.512 }' 
00:22:03.512 00:04:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:03.512 00:04:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:04.076 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:04.077 [2024-05-15 00:04:04.663423] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:04.077 [2024-05-15 00:04:04.663452] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:04.077 [2024-05-15 00:04:04.663512] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:04.077 [2024-05-15 00:04:04.663584] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:04.077 [2024-05-15 00:04:04.663596] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a5840 name raid_bdev1, state offline 00:22:04.333 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # jq length 00:22:04.334 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:04.591 00:04:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:04.591 /dev/nbd0 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 
00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:04.848 1+0 records in 00:22:04.848 1+0 records out 00:22:04.848 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00017442 s, 23.5 MB/s 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:22:04.848 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:04.849 /dev/nbd1 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:04.849 1+0 records in 00:22:04.849 1+0 records out 00:22:04.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000347083 s, 11.8 MB/s 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 
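Each nbd device in this trace is gated by a waitfornbd-style readiness probe before any data is moved: poll /proc/partitions for the device name, then prove the device actually serves I/O with a single 4 KiB O_DIRECT read. A sketch of that probe, reconstructed from the traced commands, is shown below; the scratch-file path and the delay between retries are assumptions (the trace writes the probe read to a file under the SPDK test tree and does not show any sleep).

    # Reconstructed readiness probe; retry delay and scratch-file path are assumptions.
    waitfornbd() {
        local nbd_name=$1 i size
        # Poll until the kernel lists the device (up to 20 attempts, as in the trace).
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # delay between attempts is not visible in this excerpt
        done
        # One 4 KiB direct read must land a non-empty file, mirroring the dd/stat pair above.
        dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || return 1
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]
    }

In this excerpt the probe runs for nbd0 and nbd1 right before the two devices are compared byte-for-byte with cmp.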
00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:04.849 00:04:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:05.106 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:05.363 00:04:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@795 -- # killprocess 488378 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@946 -- # '[' -z 488378 ']' 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 488378 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test 
-- common/autotest_common.sh@951 -- # uname 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 488378 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 488378' 00:22:05.621 killing process with pid 488378 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 488378 00:22:05.621 Received shutdown signal, test time was about 60.000000 seconds 00:22:05.621 00:22:05.621 Latency(us) 00:22:05.621 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:05.621 =================================================================================================================== 00:22:05.621 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:05.621 [2024-05-15 00:04:06.129930] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:05.621 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 488378 00:22:05.621 [2024-05-15 00:04:06.178159] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:05.879 00:04:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@797 -- # return 0 00:22:05.879 00:22:05.879 real 0m23.404s 00:22:05.879 user 0m32.062s 00:22:05.879 sys 0m4.791s 00:22:05.879 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:05.879 00:04:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.879 ************************************ 00:22:05.879 END TEST raid_rebuild_test 00:22:05.879 ************************************ 00:22:05.879 00:04:06 bdev_raid -- bdev/bdev_raid.sh@824 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:22:05.879 00:04:06 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:22:05.879 00:04:06 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:05.879 00:04:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:06.137 ************************************ 00:22:06.137 START TEST raid_rebuild_test_sb 00:22:06.137 ************************************ 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true false true 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local verify=true 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # local strip_size 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@582 -- # local create_arg 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local data_offset 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # raid_pid=491604 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # waitforlisten 491604 /var/tmp/spdk-raid.sock 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 491604 ']' 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:06.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
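Before the raid_rebuild_test_sb run below reaches its rebuild steps, it starts bdevperf on the dedicated RPC socket and assembles the array from malloc bdevs fronted by passthru bdevs. The following condensed bash sketch is taken from the rpc.py calls that appear further down in the trace; only the four base bdevs are shown, and the spare path (spare_malloc plus its delay bdev) is omitted.

    # Condensed from the traced rpc.py calls; error handling omitted.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    base_bdevs=""
    for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
        # 32 MiB malloc bdev with a 512-byte block size, fronted by a passthru bdev of the same name.
        $rpc -s $sock bdev_malloc_create 32 512 -b "${b}_malloc"
        $rpc -s $sock bdev_passthru_create -b "${b}_malloc" -p "$b"
        base_bdevs="$base_bdevs $b"
    done
    # -s writes an on-disk superblock, which is what distinguishes the _sb variant.
    $rpc -s $sock bdev_raid_create -s -r raid1 -b "${base_bdevs# }" -n raid_bdev1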
00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:06.137 00:04:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:06.137 [2024-05-15 00:04:06.581000] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:22:06.137 [2024-05-15 00:04:06.581055] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid491604 ] 00:22:06.137 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:06.137 Zero copy mechanism will not be used. 00:22:06.137 [2024-05-15 00:04:06.695092] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:06.395 [2024-05-15 00:04:06.799863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:06.395 [2024-05-15 00:04:06.865487] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:06.395 [2024-05-15 00:04:06.865539] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:06.960 00:04:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:06.960 00:04:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:22:06.960 00:04:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:06.960 00:04:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:07.217 BaseBdev1_malloc 00:22:07.217 00:04:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:07.475 [2024-05-15 00:04:07.978577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:07.475 [2024-05-15 00:04:07.978623] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.475 [2024-05-15 00:04:07.978644] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be1b50 00:22:07.475 [2024-05-15 00:04:07.978662] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.475 [2024-05-15 00:04:07.980219] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.475 [2024-05-15 00:04:07.980249] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:07.475 BaseBdev1 00:22:07.475 00:04:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:07.475 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:07.733 BaseBdev2_malloc 00:22:07.733 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:07.990 [2024-05-15 00:04:08.412467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:07.990 [2024-05-15 00:04:08.412512] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:22:07.990 [2024-05-15 00:04:08.412530] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d87d10 00:22:07.990 [2024-05-15 00:04:08.412543] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.990 [2024-05-15 00:04:08.413934] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.990 [2024-05-15 00:04:08.413962] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:07.990 BaseBdev2 00:22:07.990 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:07.990 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:08.248 BaseBdev3_malloc 00:22:08.248 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:08.505 [2024-05-15 00:04:08.906348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:08.505 [2024-05-15 00:04:08.906396] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.505 [2024-05-15 00:04:08.906420] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d89510 00:22:08.505 [2024-05-15 00:04:08.906432] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.505 [2024-05-15 00:04:08.907796] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.505 [2024-05-15 00:04:08.907825] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:08.505 BaseBdev3 00:22:08.505 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:08.505 00:04:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:08.763 BaseBdev4_malloc 00:22:08.763 00:04:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:09.020 [2024-05-15 00:04:09.396217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:09.020 [2024-05-15 00:04:09.396261] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.020 [2024-05-15 00:04:09.396282] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d93290 00:22:09.020 [2024-05-15 00:04:09.396295] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.020 [2024-05-15 00:04:09.397672] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.020 [2024-05-15 00:04:09.397700] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:09.020 BaseBdev4 00:22:09.020 00:04:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:09.277 spare_malloc 00:22:09.277 00:04:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:09.534 spare_delay 00:22:09.534 00:04:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:09.791 [2024-05-15 00:04:10.143059] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:09.791 [2024-05-15 00:04:10.143112] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.791 [2024-05-15 00:04:10.143135] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bd8e00 00:22:09.791 [2024-05-15 00:04:10.143147] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.791 [2024-05-15 00:04:10.144734] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.791 [2024-05-15 00:04:10.144763] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:09.791 spare 00:22:09.791 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:10.049 [2024-05-15 00:04:10.387737] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:10.049 [2024-05-15 00:04:10.388978] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:10.049 [2024-05-15 00:04:10.389037] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:10.049 [2024-05-15 00:04:10.389081] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:10.049 [2024-05-15 00:04:10.389274] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bdc840 00:22:10.049 [2024-05-15 00:04:10.389286] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:10.049 [2024-05-15 00:04:10.389493] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d86540 00:22:10.049 [2024-05-15 00:04:10.389645] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bdc840 00:22:10.049 [2024-05-15 00:04:10.389655] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bdc840 00:22:10.049 [2024-05-15 00:04:10.389752] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.049 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.306 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:10.306 "name": "raid_bdev1", 00:22:10.306 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:10.306 "strip_size_kb": 0, 00:22:10.306 "state": "online", 00:22:10.306 "raid_level": "raid1", 00:22:10.306 "superblock": true, 00:22:10.306 "num_base_bdevs": 4, 00:22:10.306 "num_base_bdevs_discovered": 4, 00:22:10.306 "num_base_bdevs_operational": 4, 00:22:10.306 "base_bdevs_list": [ 00:22:10.306 { 00:22:10.306 "name": "BaseBdev1", 00:22:10.306 "uuid": "6f18bf1d-4296-5685-8862-17fd1314bb35", 00:22:10.306 "is_configured": true, 00:22:10.306 "data_offset": 2048, 00:22:10.306 "data_size": 63488 00:22:10.306 }, 00:22:10.306 { 00:22:10.306 "name": "BaseBdev2", 00:22:10.307 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:10.307 "is_configured": true, 00:22:10.307 "data_offset": 2048, 00:22:10.307 "data_size": 63488 00:22:10.307 }, 00:22:10.307 { 00:22:10.307 "name": "BaseBdev3", 00:22:10.307 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:10.307 "is_configured": true, 00:22:10.307 "data_offset": 2048, 00:22:10.307 "data_size": 63488 00:22:10.307 }, 00:22:10.307 { 00:22:10.307 "name": "BaseBdev4", 00:22:10.307 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:10.307 "is_configured": true, 00:22:10.307 "data_offset": 2048, 00:22:10.307 "data_size": 63488 00:22:10.307 } 00:22:10.307 ] 00:22:10.307 }' 00:22:10.307 00:04:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:10.307 00:04:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:10.920 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:10.920 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:22:10.920 [2024-05-15 00:04:11.414708] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.920 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:22:10.920 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.920 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:11.177 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:11.433 [2024-05-15 00:04:11.903777] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d864b0 00:22:11.433 /dev/nbd0 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:11.433 1+0 records in 00:22:11.433 1+0 records out 00:22:11.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000483249 s, 8.5 MB/s 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:22:11.433 00:04:11 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:22:11.433 00:04:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:19.524 63488+0 records in 00:22:19.524 63488+0 records out 00:22:19.524 32505856 bytes (33 MB, 31 MiB) copied, 7.67819 s, 4.2 MB/s 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:19.524 [2024-05-15 00:04:19.910924] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:19.524 00:04:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:19.781 [2024-05-15 00:04:20.143608] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 
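[editor's note] The records above trace the raid_rebuild_test_sb flow: malloc bdevs are wrapped in passthru bdevs, a delayed "spare" is prepared, a raid1 bdev with superblock is assembled, filled through /dev/nbd0, and then BaseBdev1 is removed so the array can be verified online with 3 of 4 members. A condensed sketch of that RPC sequence is reproduced below for readability; it only uses commands that appear verbatim in this log, assumes an SPDK target is already serving the /var/tmp/spdk-raid.sock RPC socket, and the rpc.py path is the one used by this CI workspace, so adjust both for any other environment.

    #!/usr/bin/env bash
    # Sketch of the rebuild-test RPC sequence recorded above (paths taken from this log).
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Base bdevs: 32 MiB malloc devices with 512-byte blocks, each wrapped in a passthru bdev.
    for i in 1 2 3 4; do
        $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # Spare: a delay bdev wrapped in a passthru bdev, as in the log.
    $rpc bdev_malloc_create 32 512 -b spare_malloc
    $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc bdev_passthru_create -b spare_delay -p spare

    # raid1 bdev with superblock (-s) over the four base bdevs.
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # Expose it over NBD, fill it with random data, then detach.
    $rpc nbd_start_disk raid_bdev1 /dev/nbd0
    dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
    $rpc nbd_stop_disk /dev/nbd0

    # Degrade the array and inspect its state (the JSON dumps seen in the log).
    $rpc bdev_raid_remove_base_bdev BaseBdev1
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The subsequent records show exactly this verification: raid_bdev1 stays "online" at raid_level raid1 with num_base_bdevs_discovered and num_base_bdevs_operational both reported as 3.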
00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.781 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.038 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:20.038 "name": "raid_bdev1", 00:22:20.038 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:20.038 "strip_size_kb": 0, 00:22:20.038 "state": "online", 00:22:20.038 "raid_level": "raid1", 00:22:20.038 "superblock": true, 00:22:20.038 "num_base_bdevs": 4, 00:22:20.038 "num_base_bdevs_discovered": 3, 00:22:20.038 "num_base_bdevs_operational": 3, 00:22:20.038 "base_bdevs_list": [ 00:22:20.038 { 00:22:20.038 "name": null, 00:22:20.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.038 "is_configured": false, 00:22:20.038 "data_offset": 2048, 00:22:20.038 "data_size": 63488 00:22:20.038 }, 00:22:20.038 { 00:22:20.038 "name": "BaseBdev2", 00:22:20.038 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:20.038 "is_configured": true, 00:22:20.038 "data_offset": 2048, 00:22:20.038 "data_size": 63488 00:22:20.038 }, 00:22:20.038 { 00:22:20.038 "name": "BaseBdev3", 00:22:20.038 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:20.038 "is_configured": true, 00:22:20.038 "data_offset": 2048, 00:22:20.038 "data_size": 63488 00:22:20.038 }, 00:22:20.038 { 00:22:20.038 "name": "BaseBdev4", 00:22:20.038 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:20.038 "is_configured": true, 00:22:20.038 "data_offset": 2048, 00:22:20.038 "data_size": 63488 00:22:20.038 } 00:22:20.038 ] 00:22:20.038 }' 00:22:20.038 00:04:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:20.038 00:04:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.602 00:04:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:20.859 [2024-05-15 00:04:21.230485] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:20.859 [2024-05-15 00:04:21.234577] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1be0ab0 00:22:20.859 [2024-05-15 00:04:21.236837] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:20.859 00:04:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # sleep 1 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.790 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.047 00:04:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:22.047 "name": "raid_bdev1", 00:22:22.047 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:22.047 "strip_size_kb": 0, 00:22:22.047 "state": "online", 00:22:22.047 "raid_level": "raid1", 00:22:22.047 "superblock": true, 00:22:22.047 "num_base_bdevs": 4, 00:22:22.047 "num_base_bdevs_discovered": 4, 00:22:22.047 "num_base_bdevs_operational": 4, 00:22:22.047 "process": { 00:22:22.047 "type": "rebuild", 00:22:22.047 "target": "spare", 00:22:22.047 "progress": { 00:22:22.047 "blocks": 24576, 00:22:22.047 "percent": 38 00:22:22.047 } 00:22:22.047 }, 00:22:22.047 "base_bdevs_list": [ 00:22:22.047 { 00:22:22.047 "name": "spare", 00:22:22.047 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:22.047 "is_configured": true, 00:22:22.047 "data_offset": 2048, 00:22:22.047 "data_size": 63488 00:22:22.047 }, 00:22:22.047 { 00:22:22.047 "name": "BaseBdev2", 00:22:22.047 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:22.047 "is_configured": true, 00:22:22.047 "data_offset": 2048, 00:22:22.047 "data_size": 63488 00:22:22.047 }, 00:22:22.047 { 00:22:22.047 "name": "BaseBdev3", 00:22:22.047 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:22.047 "is_configured": true, 00:22:22.047 "data_offset": 2048, 00:22:22.047 "data_size": 63488 00:22:22.047 }, 00:22:22.047 { 00:22:22.047 "name": "BaseBdev4", 00:22:22.047 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:22.047 "is_configured": true, 00:22:22.047 "data_offset": 2048, 00:22:22.047 "data_size": 63488 00:22:22.047 } 00:22:22.047 ] 00:22:22.047 }' 00:22:22.047 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:22.047 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:22.047 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:22.047 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:22.047 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:22.305 [2024-05-15 00:04:22.823670] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:22.305 [2024-05-15 00:04:22.849109] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:22.305 [2024-05-15 00:04:22.849151] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:22.305 
00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.305 00:04:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.563 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:22.563 "name": "raid_bdev1", 00:22:22.563 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:22.563 "strip_size_kb": 0, 00:22:22.563 "state": "online", 00:22:22.563 "raid_level": "raid1", 00:22:22.563 "superblock": true, 00:22:22.563 "num_base_bdevs": 4, 00:22:22.563 "num_base_bdevs_discovered": 3, 00:22:22.563 "num_base_bdevs_operational": 3, 00:22:22.563 "base_bdevs_list": [ 00:22:22.563 { 00:22:22.563 "name": null, 00:22:22.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.563 "is_configured": false, 00:22:22.563 "data_offset": 2048, 00:22:22.563 "data_size": 63488 00:22:22.563 }, 00:22:22.563 { 00:22:22.563 "name": "BaseBdev2", 00:22:22.563 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:22.563 "is_configured": true, 00:22:22.563 "data_offset": 2048, 00:22:22.563 "data_size": 63488 00:22:22.563 }, 00:22:22.563 { 00:22:22.563 "name": "BaseBdev3", 00:22:22.563 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:22.563 "is_configured": true, 00:22:22.563 "data_offset": 2048, 00:22:22.563 "data_size": 63488 00:22:22.563 }, 00:22:22.563 { 00:22:22.563 "name": "BaseBdev4", 00:22:22.563 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:22.563 "is_configured": true, 00:22:22.563 "data_offset": 2048, 00:22:22.563 "data_size": 63488 00:22:22.563 } 00:22:22.563 ] 00:22:22.563 }' 00:22:22.563 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:22.563 00:04:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.127 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.384 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:23.384 "name": "raid_bdev1", 00:22:23.384 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:23.384 "strip_size_kb": 0, 00:22:23.384 "state": "online", 00:22:23.384 "raid_level": "raid1", 00:22:23.384 "superblock": true, 00:22:23.384 "num_base_bdevs": 4, 00:22:23.384 "num_base_bdevs_discovered": 3, 00:22:23.384 "num_base_bdevs_operational": 3, 00:22:23.384 "base_bdevs_list": [ 00:22:23.384 { 00:22:23.384 "name": null, 
00:22:23.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.384 "is_configured": false, 00:22:23.384 "data_offset": 2048, 00:22:23.384 "data_size": 63488 00:22:23.384 }, 00:22:23.384 { 00:22:23.384 "name": "BaseBdev2", 00:22:23.384 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:23.384 "is_configured": true, 00:22:23.384 "data_offset": 2048, 00:22:23.384 "data_size": 63488 00:22:23.384 }, 00:22:23.384 { 00:22:23.384 "name": "BaseBdev3", 00:22:23.384 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:23.384 "is_configured": true, 00:22:23.384 "data_offset": 2048, 00:22:23.384 "data_size": 63488 00:22:23.384 }, 00:22:23.384 { 00:22:23.384 "name": "BaseBdev4", 00:22:23.384 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:23.384 "is_configured": true, 00:22:23.384 "data_offset": 2048, 00:22:23.384 "data_size": 63488 00:22:23.384 } 00:22:23.384 ] 00:22:23.384 }' 00:22:23.384 00:04:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:23.640 00:04:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:23.640 00:04:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:23.640 00:04:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:23.640 00:04:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:23.897 [2024-05-15 00:04:24.281569] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:23.897 [2024-05-15 00:04:24.285700] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1be10b0 00:22:23.897 [2024-05-15 00:04:24.287207] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:23.897 00:04:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # sleep 1 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.827 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.083 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:25.083 "name": "raid_bdev1", 00:22:25.083 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:25.083 "strip_size_kb": 0, 00:22:25.083 "state": "online", 00:22:25.083 "raid_level": "raid1", 00:22:25.083 "superblock": true, 00:22:25.083 "num_base_bdevs": 4, 00:22:25.083 "num_base_bdevs_discovered": 4, 00:22:25.083 "num_base_bdevs_operational": 4, 00:22:25.083 "process": { 00:22:25.083 "type": "rebuild", 00:22:25.083 "target": "spare", 00:22:25.083 "progress": { 00:22:25.083 "blocks": 24576, 00:22:25.083 "percent": 38 00:22:25.083 
} 00:22:25.083 }, 00:22:25.083 "base_bdevs_list": [ 00:22:25.083 { 00:22:25.083 "name": "spare", 00:22:25.083 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:25.083 "is_configured": true, 00:22:25.083 "data_offset": 2048, 00:22:25.083 "data_size": 63488 00:22:25.083 }, 00:22:25.083 { 00:22:25.083 "name": "BaseBdev2", 00:22:25.083 "uuid": "be7b3e04-d912-5e96-96c7-60fc3e2a7ed1", 00:22:25.083 "is_configured": true, 00:22:25.083 "data_offset": 2048, 00:22:25.083 "data_size": 63488 00:22:25.083 }, 00:22:25.083 { 00:22:25.083 "name": "BaseBdev3", 00:22:25.083 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:25.083 "is_configured": true, 00:22:25.083 "data_offset": 2048, 00:22:25.083 "data_size": 63488 00:22:25.083 }, 00:22:25.083 { 00:22:25.083 "name": "BaseBdev4", 00:22:25.083 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:25.083 "is_configured": true, 00:22:25.083 "data_offset": 2048, 00:22:25.083 "data_size": 63488 00:22:25.084 } 00:22:25.084 ] 00:22:25.084 }' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:22:25.084 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:22:25.084 00:04:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:25.341 [2024-05-15 00:04:25.870878] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:25.341 [2024-05-15 00:04:25.899928] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1be10b0 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:25.598 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.598 00:04:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.855 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:25.855 "name": "raid_bdev1", 00:22:25.855 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:25.855 "strip_size_kb": 0, 00:22:25.855 "state": "online", 00:22:25.855 "raid_level": "raid1", 00:22:25.855 "superblock": true, 00:22:25.855 "num_base_bdevs": 4, 00:22:25.855 "num_base_bdevs_discovered": 3, 00:22:25.855 "num_base_bdevs_operational": 3, 00:22:25.855 "process": { 00:22:25.855 "type": "rebuild", 00:22:25.855 "target": "spare", 00:22:25.855 "progress": { 00:22:25.856 "blocks": 38912, 00:22:25.856 "percent": 61 00:22:25.856 } 00:22:25.856 }, 00:22:25.856 "base_bdevs_list": [ 00:22:25.856 { 00:22:25.856 "name": "spare", 00:22:25.856 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:25.856 "is_configured": true, 00:22:25.856 "data_offset": 2048, 00:22:25.856 "data_size": 63488 00:22:25.856 }, 00:22:25.856 { 00:22:25.856 "name": null, 00:22:25.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.856 "is_configured": false, 00:22:25.856 "data_offset": 2048, 00:22:25.856 "data_size": 63488 00:22:25.856 }, 00:22:25.856 { 00:22:25.856 "name": "BaseBdev3", 00:22:25.856 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:25.856 "is_configured": true, 00:22:25.856 "data_offset": 2048, 00:22:25.856 "data_size": 63488 00:22:25.856 }, 00:22:25.856 { 00:22:25.856 "name": "BaseBdev4", 00:22:25.856 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:25.856 "is_configured": true, 00:22:25.856 "data_offset": 2048, 00:22:25.856 "data_size": 63488 00:22:25.856 } 00:22:25.856 ] 00:22:25.856 }' 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # local timeout=757 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.856 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:26.113 "name": "raid_bdev1", 00:22:26.113 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:26.113 "strip_size_kb": 0, 00:22:26.113 "state": "online", 
00:22:26.113 "raid_level": "raid1", 00:22:26.113 "superblock": true, 00:22:26.113 "num_base_bdevs": 4, 00:22:26.113 "num_base_bdevs_discovered": 3, 00:22:26.113 "num_base_bdevs_operational": 3, 00:22:26.113 "process": { 00:22:26.113 "type": "rebuild", 00:22:26.113 "target": "spare", 00:22:26.113 "progress": { 00:22:26.113 "blocks": 45056, 00:22:26.113 "percent": 70 00:22:26.113 } 00:22:26.113 }, 00:22:26.113 "base_bdevs_list": [ 00:22:26.113 { 00:22:26.113 "name": "spare", 00:22:26.113 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:26.113 "is_configured": true, 00:22:26.113 "data_offset": 2048, 00:22:26.113 "data_size": 63488 00:22:26.113 }, 00:22:26.113 { 00:22:26.113 "name": null, 00:22:26.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.113 "is_configured": false, 00:22:26.113 "data_offset": 2048, 00:22:26.113 "data_size": 63488 00:22:26.113 }, 00:22:26.113 { 00:22:26.113 "name": "BaseBdev3", 00:22:26.113 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:26.113 "is_configured": true, 00:22:26.113 "data_offset": 2048, 00:22:26.113 "data_size": 63488 00:22:26.113 }, 00:22:26.113 { 00:22:26.113 "name": "BaseBdev4", 00:22:26.113 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:26.113 "is_configured": true, 00:22:26.113 "data_offset": 2048, 00:22:26.113 "data_size": 63488 00:22:26.113 } 00:22:26.113 ] 00:22:26.113 }' 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:26.113 00:04:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:27.044 [2024-05-15 00:04:27.411688] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:27.044 [2024-05-15 00:04:27.411753] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:27.044 [2024-05-15 00:04:27.411855] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.301 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.558 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:27.558 "name": "raid_bdev1", 00:22:27.558 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:27.558 "strip_size_kb": 0, 00:22:27.558 "state": "online", 
00:22:27.558 "raid_level": "raid1", 00:22:27.558 "superblock": true, 00:22:27.558 "num_base_bdevs": 4, 00:22:27.558 "num_base_bdevs_discovered": 3, 00:22:27.558 "num_base_bdevs_operational": 3, 00:22:27.558 "base_bdevs_list": [ 00:22:27.558 { 00:22:27.558 "name": "spare", 00:22:27.558 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:27.558 "is_configured": true, 00:22:27.558 "data_offset": 2048, 00:22:27.558 "data_size": 63488 00:22:27.558 }, 00:22:27.558 { 00:22:27.558 "name": null, 00:22:27.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.558 "is_configured": false, 00:22:27.558 "data_offset": 2048, 00:22:27.558 "data_size": 63488 00:22:27.558 }, 00:22:27.558 { 00:22:27.558 "name": "BaseBdev3", 00:22:27.558 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:27.558 "is_configured": true, 00:22:27.558 "data_offset": 2048, 00:22:27.558 "data_size": 63488 00:22:27.558 }, 00:22:27.558 { 00:22:27.558 "name": "BaseBdev4", 00:22:27.558 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:27.558 "is_configured": true, 00:22:27.558 "data_offset": 2048, 00:22:27.558 "data_size": 63488 00:22:27.558 } 00:22:27.558 ] 00:22:27.558 }' 00:22:27.558 00:04:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # break 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.558 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:27.815 "name": "raid_bdev1", 00:22:27.815 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:27.815 "strip_size_kb": 0, 00:22:27.815 "state": "online", 00:22:27.815 "raid_level": "raid1", 00:22:27.815 "superblock": true, 00:22:27.815 "num_base_bdevs": 4, 00:22:27.815 "num_base_bdevs_discovered": 3, 00:22:27.815 "num_base_bdevs_operational": 3, 00:22:27.815 "base_bdevs_list": [ 00:22:27.815 { 00:22:27.815 "name": "spare", 00:22:27.815 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:27.815 "is_configured": true, 00:22:27.815 "data_offset": 2048, 00:22:27.815 "data_size": 63488 00:22:27.815 }, 00:22:27.815 { 00:22:27.815 "name": null, 00:22:27.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.815 "is_configured": false, 00:22:27.815 "data_offset": 2048, 00:22:27.815 "data_size": 63488 00:22:27.815 }, 00:22:27.815 { 00:22:27.815 "name": 
"BaseBdev3", 00:22:27.815 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:27.815 "is_configured": true, 00:22:27.815 "data_offset": 2048, 00:22:27.815 "data_size": 63488 00:22:27.815 }, 00:22:27.815 { 00:22:27.815 "name": "BaseBdev4", 00:22:27.815 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:27.815 "is_configured": true, 00:22:27.815 "data_offset": 2048, 00:22:27.815 "data_size": 63488 00:22:27.815 } 00:22:27.815 ] 00:22:27.815 }' 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:27.815 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:27.816 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:27.816 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:27.816 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.816 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.073 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:28.073 "name": "raid_bdev1", 00:22:28.073 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:28.073 "strip_size_kb": 0, 00:22:28.073 "state": "online", 00:22:28.073 "raid_level": "raid1", 00:22:28.073 "superblock": true, 00:22:28.073 "num_base_bdevs": 4, 00:22:28.073 "num_base_bdevs_discovered": 3, 00:22:28.073 "num_base_bdevs_operational": 3, 00:22:28.073 "base_bdevs_list": [ 00:22:28.073 { 00:22:28.073 "name": "spare", 00:22:28.073 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:28.073 "is_configured": true, 00:22:28.073 "data_offset": 2048, 00:22:28.073 "data_size": 63488 00:22:28.073 }, 00:22:28.073 { 00:22:28.073 "name": null, 00:22:28.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.073 "is_configured": false, 00:22:28.073 "data_offset": 2048, 00:22:28.073 "data_size": 63488 00:22:28.073 }, 00:22:28.073 { 00:22:28.073 "name": "BaseBdev3", 00:22:28.073 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:28.073 "is_configured": true, 00:22:28.073 "data_offset": 2048, 00:22:28.073 "data_size": 63488 00:22:28.073 }, 00:22:28.073 { 00:22:28.073 "name": "BaseBdev4", 00:22:28.073 "uuid": 
"5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:28.073 "is_configured": true, 00:22:28.073 "data_offset": 2048, 00:22:28.073 "data_size": 63488 00:22:28.073 } 00:22:28.073 ] 00:22:28.073 }' 00:22:28.073 00:04:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:28.073 00:04:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:28.637 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:28.894 [2024-05-15 00:04:29.289158] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:28.894 [2024-05-15 00:04:29.289187] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:28.894 [2024-05-15 00:04:29.289249] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:28.894 [2024-05-15 00:04:29.289323] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:28.894 [2024-05-15 00:04:29.289335] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bdc840 name raid_bdev1, state offline 00:22:28.894 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.895 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # jq length 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.151 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:29.408 /dev/nbd0 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.408 1+0 records in 00:22:29.408 1+0 records out 00:22:29.408 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189989 s, 21.6 MB/s 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.408 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.409 00:04:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:29.666 /dev/nbd1 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.666 1+0 records in 00:22:29.666 1+0 records out 00:22:29.666 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308179 s, 13.3 MB/s 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.666 00:04:30 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:29.666 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:29.923 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:30.181 
00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:22:30.181 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:30.468 00:04:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:30.726 [2024-05-15 00:04:31.193746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:30.726 [2024-05-15 00:04:31.193801] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.726 [2024-05-15 00:04:31.193823] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be0eb0 00:22:30.726 [2024-05-15 00:04:31.193836] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.726 [2024-05-15 00:04:31.195464] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.726 [2024-05-15 00:04:31.195495] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:30.726 [2024-05-15 00:04:31.195567] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:30.726 [2024-05-15 00:04:31.195596] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:30.726 BaseBdev1 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z '' ']' 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # continue 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev3 ']' 00:22:30.726 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev3 00:22:30.983 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:31.239 [2024-05-15 00:04:31.683047] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:31.239 [2024-05-15 00:04:31.683094] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.239 [2024-05-15 00:04:31.683115] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d86190 00:22:31.239 [2024-05-15 00:04:31.683128] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.239 [2024-05-15 00:04:31.683488] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.239 [2024-05-15 00:04:31.683507] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:31.239 [2024-05-15 00:04:31.683570] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid 
superblock found on bdev BaseBdev3 00:22:31.239 [2024-05-15 00:04:31.683582] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev3 (4) greater than existing raid bdev raid_bdev1 (1) 00:22:31.239 [2024-05-15 00:04:31.683593] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:31.239 [2024-05-15 00:04:31.683608] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c76490 name raid_bdev1, state configuring 00:22:31.239 [2024-05-15 00:04:31.683640] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:31.239 BaseBdev3 00:22:31.239 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:31.239 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev4 ']' 00:22:31.239 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev4 00:22:31.496 00:04:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:31.752 [2024-05-15 00:04:32.168328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:31.752 [2024-05-15 00:04:32.168369] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.752 [2024-05-15 00:04:32.168388] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d934c0 00:22:31.752 [2024-05-15 00:04:32.168412] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.752 [2024-05-15 00:04:32.168709] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.752 [2024-05-15 00:04:32.168726] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:31.752 [2024-05-15 00:04:32.168782] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev4 00:22:31.752 [2024-05-15 00:04:32.168801] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:31.752 BaseBdev4 00:22:31.752 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:32.009 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:32.266 [2024-05-15 00:04:32.649589] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:32.266 [2024-05-15 00:04:32.649626] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.266 [2024-05-15 00:04:32.649645] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bdbee0 00:22:32.266 [2024-05-15 00:04:32.649657] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.266 [2024-05-15 00:04:32.649975] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.266 [2024-05-15 00:04:32.649994] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:32.266 [2024-05-15 00:04:32.650062] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid 
superblock found on bdev spare 00:22:32.266 [2024-05-15 00:04:32.650080] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:32.266 spare 00:22:32.266 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:32.266 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:32.266 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:32.266 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.267 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.267 [2024-05-15 00:04:32.750410] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c76730 00:22:32.267 [2024-05-15 00:04:32.750426] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:32.267 [2024-05-15 00:04:32.750606] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e9cd0 00:22:32.267 [2024-05-15 00:04:32.750747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c76730 00:22:32.267 [2024-05-15 00:04:32.750757] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c76730 00:22:32.267 [2024-05-15 00:04:32.750868] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.558 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:32.559 "name": "raid_bdev1", 00:22:32.559 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:32.559 "strip_size_kb": 0, 00:22:32.559 "state": "online", 00:22:32.559 "raid_level": "raid1", 00:22:32.559 "superblock": true, 00:22:32.559 "num_base_bdevs": 4, 00:22:32.559 "num_base_bdevs_discovered": 3, 00:22:32.559 "num_base_bdevs_operational": 3, 00:22:32.559 "base_bdevs_list": [ 00:22:32.559 { 00:22:32.559 "name": "spare", 00:22:32.559 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:32.559 "is_configured": true, 00:22:32.559 "data_offset": 2048, 00:22:32.559 "data_size": 63488 00:22:32.559 }, 00:22:32.559 { 00:22:32.559 "name": null, 00:22:32.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.559 "is_configured": false, 00:22:32.559 "data_offset": 2048, 00:22:32.559 "data_size": 63488 00:22:32.559 }, 00:22:32.559 { 00:22:32.559 "name": "BaseBdev3", 00:22:32.559 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:32.559 "is_configured": true, 00:22:32.559 "data_offset": 2048, 00:22:32.559 "data_size": 63488 00:22:32.559 }, 00:22:32.559 { 00:22:32.559 "name": 
"BaseBdev4", 00:22:32.559 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:32.559 "is_configured": true, 00:22:32.559 "data_offset": 2048, 00:22:32.559 "data_size": 63488 00:22:32.559 } 00:22:32.559 ] 00:22:32.559 }' 00:22:32.559 00:04:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:32.559 00:04:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.122 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:33.123 "name": "raid_bdev1", 00:22:33.123 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:33.123 "strip_size_kb": 0, 00:22:33.123 "state": "online", 00:22:33.123 "raid_level": "raid1", 00:22:33.123 "superblock": true, 00:22:33.123 "num_base_bdevs": 4, 00:22:33.123 "num_base_bdevs_discovered": 3, 00:22:33.123 "num_base_bdevs_operational": 3, 00:22:33.123 "base_bdevs_list": [ 00:22:33.123 { 00:22:33.123 "name": "spare", 00:22:33.123 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:33.123 "is_configured": true, 00:22:33.123 "data_offset": 2048, 00:22:33.123 "data_size": 63488 00:22:33.123 }, 00:22:33.123 { 00:22:33.123 "name": null, 00:22:33.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.123 "is_configured": false, 00:22:33.123 "data_offset": 2048, 00:22:33.123 "data_size": 63488 00:22:33.123 }, 00:22:33.123 { 00:22:33.123 "name": "BaseBdev3", 00:22:33.123 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:33.123 "is_configured": true, 00:22:33.123 "data_offset": 2048, 00:22:33.123 "data_size": 63488 00:22:33.123 }, 00:22:33.123 { 00:22:33.123 "name": "BaseBdev4", 00:22:33.123 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:33.123 "is_configured": true, 00:22:33.123 "data_offset": 2048, 00:22:33.123 "data_size": 63488 00:22:33.123 } 00:22:33.123 ] 00:22:33.123 }' 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.123 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:33.380 00:04:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:22:33.380 00:04:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:33.638 [2024-05-15 00:04:34.165945] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.638 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.896 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:33.896 "name": "raid_bdev1", 00:22:33.896 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:33.896 "strip_size_kb": 0, 00:22:33.896 "state": "online", 00:22:33.896 "raid_level": "raid1", 00:22:33.896 "superblock": true, 00:22:33.896 "num_base_bdevs": 4, 00:22:33.896 "num_base_bdevs_discovered": 2, 00:22:33.896 "num_base_bdevs_operational": 2, 00:22:33.896 "base_bdevs_list": [ 00:22:33.896 { 00:22:33.896 "name": null, 00:22:33.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.896 "is_configured": false, 00:22:33.896 "data_offset": 2048, 00:22:33.896 "data_size": 63488 00:22:33.896 }, 00:22:33.896 { 00:22:33.896 "name": null, 00:22:33.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.896 "is_configured": false, 00:22:33.896 "data_offset": 2048, 00:22:33.896 "data_size": 63488 00:22:33.896 }, 00:22:33.896 { 00:22:33.896 "name": "BaseBdev3", 00:22:33.896 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:33.896 "is_configured": true, 00:22:33.896 "data_offset": 2048, 00:22:33.896 "data_size": 63488 00:22:33.896 }, 00:22:33.896 { 00:22:33.896 "name": "BaseBdev4", 00:22:33.896 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:33.896 "is_configured": true, 00:22:33.896 "data_offset": 2048, 00:22:33.896 "data_size": 63488 00:22:33.896 } 00:22:33.896 ] 00:22:33.896 }' 00:22:33.896 00:04:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:33.896 00:04:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.461 00:04:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:34.718 [2024-05-15 00:04:35.252991] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:34.718 [2024-05-15 00:04:35.253156] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:34.718 [2024-05-15 00:04:35.253173] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:34.718 [2024-05-15 00:04:35.253202] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:34.718 [2024-05-15 00:04:35.257139] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bdd070 00:22:34.718 [2024-05-15 00:04:35.258502] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:34.718 00:04:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # sleep 1 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:36.090 "name": "raid_bdev1", 00:22:36.090 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:36.090 "strip_size_kb": 0, 00:22:36.090 "state": "online", 00:22:36.090 "raid_level": "raid1", 00:22:36.090 "superblock": true, 00:22:36.090 "num_base_bdevs": 4, 00:22:36.090 "num_base_bdevs_discovered": 3, 00:22:36.090 "num_base_bdevs_operational": 3, 00:22:36.090 "process": { 00:22:36.090 "type": "rebuild", 00:22:36.090 "target": "spare", 00:22:36.090 "progress": { 00:22:36.090 "blocks": 22528, 00:22:36.090 "percent": 35 00:22:36.090 } 00:22:36.090 }, 00:22:36.090 "base_bdevs_list": [ 00:22:36.090 { 00:22:36.090 "name": "spare", 00:22:36.090 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:36.090 "is_configured": true, 00:22:36.090 "data_offset": 2048, 00:22:36.090 "data_size": 63488 00:22:36.090 }, 00:22:36.090 { 00:22:36.090 "name": null, 00:22:36.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.090 "is_configured": false, 00:22:36.090 "data_offset": 2048, 00:22:36.090 "data_size": 63488 00:22:36.090 }, 00:22:36.090 { 00:22:36.090 "name": "BaseBdev3", 00:22:36.090 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:36.090 "is_configured": true, 00:22:36.090 "data_offset": 2048, 00:22:36.090 "data_size": 63488 00:22:36.090 }, 00:22:36.090 { 00:22:36.090 "name": "BaseBdev4", 00:22:36.090 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:36.090 "is_configured": true, 00:22:36.090 "data_offset": 2048, 00:22:36.090 "data_size": 63488 00:22:36.090 } 00:22:36.090 ] 00:22:36.090 }' 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:36.090 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:36.348 [2024-05-15 00:04:36.783106] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:36.348 [2024-05-15 00:04:36.871303] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:36.348 [2024-05-15 00:04:36.871348] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:36.348 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:36.348 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:36.348 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:36.348 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.349 00:04:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.607 00:04:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:36.607 "name": "raid_bdev1", 00:22:36.607 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:36.607 "strip_size_kb": 0, 00:22:36.607 "state": "online", 00:22:36.607 "raid_level": "raid1", 00:22:36.607 "superblock": true, 00:22:36.607 "num_base_bdevs": 4, 00:22:36.607 "num_base_bdevs_discovered": 2, 00:22:36.607 "num_base_bdevs_operational": 2, 00:22:36.607 "base_bdevs_list": [ 00:22:36.607 { 00:22:36.607 "name": null, 00:22:36.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.607 "is_configured": false, 00:22:36.607 "data_offset": 2048, 00:22:36.607 "data_size": 63488 00:22:36.607 }, 00:22:36.607 { 00:22:36.607 "name": null, 00:22:36.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.607 "is_configured": false, 00:22:36.607 "data_offset": 2048, 00:22:36.607 "data_size": 63488 00:22:36.607 }, 00:22:36.607 { 00:22:36.607 "name": "BaseBdev3", 00:22:36.607 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:36.607 "is_configured": true, 00:22:36.607 "data_offset": 2048, 00:22:36.607 "data_size": 63488 
00:22:36.607 }, 00:22:36.607 { 00:22:36.607 "name": "BaseBdev4", 00:22:36.607 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:36.607 "is_configured": true, 00:22:36.607 "data_offset": 2048, 00:22:36.607 "data_size": 63488 00:22:36.607 } 00:22:36.607 ] 00:22:36.607 }' 00:22:36.607 00:04:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:36.607 00:04:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:37.174 00:04:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:37.432 [2024-05-15 00:04:37.950743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:37.432 [2024-05-15 00:04:37.950804] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:37.432 [2024-05-15 00:04:37.950826] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bded10 00:22:37.432 [2024-05-15 00:04:37.950839] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:37.432 [2024-05-15 00:04:37.951227] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:37.432 [2024-05-15 00:04:37.951247] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:37.432 [2024-05-15 00:04:37.951331] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:22:37.432 [2024-05-15 00:04:37.951344] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:37.432 [2024-05-15 00:04:37.951355] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
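For reference, the degraded/rebuild cycle traced above can be replayed by hand against the same RPC socket. The commands below are the ones visible in this log (the rpc.py path, /var/tmp/spdk-raid.sock, and the raid_bdev1/spare names all come from this run); this is a minimal sketch of that sequence, not the bdev_raid.sh test script itself.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # drop the spare base bdev; raid_bdev1 stays online with 2 of 4 members
    $rpc -s $sock bdev_raid_remove_base_bdev spare
    # re-add it; the log reports "Re-adding bdev spare to raid bdev raid_bdev1." and a rebuild starts
    $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare
    # poll state and rebuild progress the same way verify_raid_bdev_process does
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'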
00:22:37.432 [2024-05-15 00:04:37.951376] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:37.432 [2024-05-15 00:04:37.955417] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c77fd0 00:22:37.432 spare 00:22:37.432 [2024-05-15 00:04:37.956778] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:37.432 00:04:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # sleep 1 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.805 00:04:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.805 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:38.805 "name": "raid_bdev1", 00:22:38.805 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:38.805 "strip_size_kb": 0, 00:22:38.805 "state": "online", 00:22:38.805 "raid_level": "raid1", 00:22:38.805 "superblock": true, 00:22:38.805 "num_base_bdevs": 4, 00:22:38.805 "num_base_bdevs_discovered": 3, 00:22:38.805 "num_base_bdevs_operational": 3, 00:22:38.805 "process": { 00:22:38.805 "type": "rebuild", 00:22:38.805 "target": "spare", 00:22:38.805 "progress": { 00:22:38.805 "blocks": 24576, 00:22:38.805 "percent": 38 00:22:38.805 } 00:22:38.805 }, 00:22:38.805 "base_bdevs_list": [ 00:22:38.805 { 00:22:38.805 "name": "spare", 00:22:38.805 "uuid": "4688e284-b146-5e21-873c-e12d564614ca", 00:22:38.805 "is_configured": true, 00:22:38.805 "data_offset": 2048, 00:22:38.805 "data_size": 63488 00:22:38.805 }, 00:22:38.805 { 00:22:38.805 "name": null, 00:22:38.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.805 "is_configured": false, 00:22:38.805 "data_offset": 2048, 00:22:38.805 "data_size": 63488 00:22:38.805 }, 00:22:38.805 { 00:22:38.805 "name": "BaseBdev3", 00:22:38.805 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:38.805 "is_configured": true, 00:22:38.805 "data_offset": 2048, 00:22:38.805 "data_size": 63488 00:22:38.805 }, 00:22:38.805 { 00:22:38.805 "name": "BaseBdev4", 00:22:38.806 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:38.806 "is_configured": true, 00:22:38.806 "data_offset": 2048, 00:22:38.806 "data_size": 63488 00:22:38.806 } 00:22:38.806 ] 00:22:38.806 }' 00:22:38.806 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:38.806 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:38.806 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:38.806 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:38.806 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:39.063 [2024-05-15 00:04:39.525341] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:39.063 [2024-05-15 00:04:39.569418] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:39.063 [2024-05-15 00:04:39.569464] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:39.063 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:39.063 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:39.063 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.064 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.321 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:39.321 "name": "raid_bdev1", 00:22:39.321 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:39.321 "strip_size_kb": 0, 00:22:39.321 "state": "online", 00:22:39.321 "raid_level": "raid1", 00:22:39.321 "superblock": true, 00:22:39.321 "num_base_bdevs": 4, 00:22:39.321 "num_base_bdevs_discovered": 2, 00:22:39.321 "num_base_bdevs_operational": 2, 00:22:39.321 "base_bdevs_list": [ 00:22:39.321 { 00:22:39.321 "name": null, 00:22:39.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.321 "is_configured": false, 00:22:39.321 "data_offset": 2048, 00:22:39.321 "data_size": 63488 00:22:39.321 }, 00:22:39.321 { 00:22:39.321 "name": null, 00:22:39.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.321 "is_configured": false, 00:22:39.321 "data_offset": 2048, 00:22:39.321 "data_size": 63488 00:22:39.321 }, 00:22:39.321 { 00:22:39.321 "name": "BaseBdev3", 00:22:39.321 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:39.321 "is_configured": true, 00:22:39.321 "data_offset": 2048, 00:22:39.321 "data_size": 63488 00:22:39.321 }, 00:22:39.321 { 00:22:39.321 "name": "BaseBdev4", 00:22:39.321 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:39.321 "is_configured": true, 00:22:39.321 "data_offset": 2048, 00:22:39.321 "data_size": 63488 00:22:39.321 } 00:22:39.321 ] 00:22:39.321 }' 00:22:39.321 00:04:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:39.321 00:04:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:39.888 00:04:40 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.888 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.146 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:40.146 "name": "raid_bdev1", 00:22:40.146 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:40.146 "strip_size_kb": 0, 00:22:40.146 "state": "online", 00:22:40.146 "raid_level": "raid1", 00:22:40.146 "superblock": true, 00:22:40.146 "num_base_bdevs": 4, 00:22:40.146 "num_base_bdevs_discovered": 2, 00:22:40.146 "num_base_bdevs_operational": 2, 00:22:40.146 "base_bdevs_list": [ 00:22:40.146 { 00:22:40.146 "name": null, 00:22:40.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.146 "is_configured": false, 00:22:40.146 "data_offset": 2048, 00:22:40.146 "data_size": 63488 00:22:40.146 }, 00:22:40.146 { 00:22:40.146 "name": null, 00:22:40.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.146 "is_configured": false, 00:22:40.146 "data_offset": 2048, 00:22:40.146 "data_size": 63488 00:22:40.146 }, 00:22:40.146 { 00:22:40.146 "name": "BaseBdev3", 00:22:40.146 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:40.146 "is_configured": true, 00:22:40.146 "data_offset": 2048, 00:22:40.146 "data_size": 63488 00:22:40.146 }, 00:22:40.146 { 00:22:40.146 "name": "BaseBdev4", 00:22:40.146 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:40.146 "is_configured": true, 00:22:40.146 "data_offset": 2048, 00:22:40.146 "data_size": 63488 00:22:40.146 } 00:22:40.146 ] 00:22:40.146 }' 00:22:40.146 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:40.404 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:40.404 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:40.404 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:40.404 00:04:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:40.662 00:04:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:40.919 [2024-05-15 00:04:41.254031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:40.919 [2024-05-15 00:04:41.254083] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.919 [2024-05-15 00:04:41.254108] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1be0eb0 00:22:40.919 [2024-05-15 
00:04:41.254123] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.919 [2024-05-15 00:04:41.254501] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.919 [2024-05-15 00:04:41.254524] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:40.919 [2024-05-15 00:04:41.254591] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:40.919 [2024-05-15 00:04:41.254604] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:40.919 [2024-05-15 00:04:41.254615] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:40.919 BaseBdev1 00:22:40.919 00:04:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@786 -- # sleep 1 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.853 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.110 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:42.110 "name": "raid_bdev1", 00:22:42.110 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:42.110 "strip_size_kb": 0, 00:22:42.110 "state": "online", 00:22:42.110 "raid_level": "raid1", 00:22:42.110 "superblock": true, 00:22:42.110 "num_base_bdevs": 4, 00:22:42.110 "num_base_bdevs_discovered": 2, 00:22:42.110 "num_base_bdevs_operational": 2, 00:22:42.110 "base_bdevs_list": [ 00:22:42.110 { 00:22:42.110 "name": null, 00:22:42.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.110 "is_configured": false, 00:22:42.110 "data_offset": 2048, 00:22:42.110 "data_size": 63488 00:22:42.111 }, 00:22:42.111 { 00:22:42.111 "name": null, 00:22:42.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.111 "is_configured": false, 00:22:42.111 "data_offset": 2048, 00:22:42.111 "data_size": 63488 00:22:42.111 }, 00:22:42.111 { 00:22:42.111 "name": "BaseBdev3", 00:22:42.111 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:42.111 "is_configured": true, 00:22:42.111 "data_offset": 2048, 00:22:42.111 "data_size": 63488 00:22:42.111 }, 00:22:42.111 { 00:22:42.111 "name": "BaseBdev4", 
00:22:42.111 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:42.111 "is_configured": true, 00:22:42.111 "data_offset": 2048, 00:22:42.111 "data_size": 63488 00:22:42.111 } 00:22:42.111 ] 00:22:42.111 }' 00:22:42.111 00:04:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:42.111 00:04:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.676 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:42.934 "name": "raid_bdev1", 00:22:42.934 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:42.934 "strip_size_kb": 0, 00:22:42.934 "state": "online", 00:22:42.934 "raid_level": "raid1", 00:22:42.934 "superblock": true, 00:22:42.934 "num_base_bdevs": 4, 00:22:42.934 "num_base_bdevs_discovered": 2, 00:22:42.934 "num_base_bdevs_operational": 2, 00:22:42.934 "base_bdevs_list": [ 00:22:42.934 { 00:22:42.934 "name": null, 00:22:42.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.934 "is_configured": false, 00:22:42.934 "data_offset": 2048, 00:22:42.934 "data_size": 63488 00:22:42.934 }, 00:22:42.934 { 00:22:42.934 "name": null, 00:22:42.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.934 "is_configured": false, 00:22:42.934 "data_offset": 2048, 00:22:42.934 "data_size": 63488 00:22:42.934 }, 00:22:42.934 { 00:22:42.934 "name": "BaseBdev3", 00:22:42.934 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:42.934 "is_configured": true, 00:22:42.934 "data_offset": 2048, 00:22:42.934 "data_size": 63488 00:22:42.934 }, 00:22:42.934 { 00:22:42.934 "name": "BaseBdev4", 00:22:42.934 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:42.934 "is_configured": true, 00:22:42.934 "data_offset": 2048, 00:22:42.934 "data_size": 63488 00:22:42.934 } 00:22:42.934 ] 00:22:42.934 }' 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:42.934 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:43.192 [2024-05-15 00:04:43.684497] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:43.192 [2024-05-15 00:04:43.684643] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:43.192 [2024-05-15 00:04:43.684660] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:43.192 request: 00:22:43.192 { 00:22:43.192 "raid_bdev": "raid_bdev1", 00:22:43.192 "base_bdev": "BaseBdev1", 00:22:43.192 "method": "bdev_raid_add_base_bdev", 00:22:43.192 "req_id": 1 00:22:43.192 } 00:22:43.192 Got JSON-RPC error response 00:22:43.192 response: 00:22:43.192 { 00:22:43.192 "code": -22, 00:22:43.192 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:43.192 } 00:22:43.192 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:22:43.192 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:43.192 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:43.192 00:04:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:43.192 00:04:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # sleep 1 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:44.123 00:04:44 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:44.123 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:44.381 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.381 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.381 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:44.381 "name": "raid_bdev1", 00:22:44.381 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:44.381 "strip_size_kb": 0, 00:22:44.381 "state": "online", 00:22:44.381 "raid_level": "raid1", 00:22:44.381 "superblock": true, 00:22:44.381 "num_base_bdevs": 4, 00:22:44.381 "num_base_bdevs_discovered": 2, 00:22:44.381 "num_base_bdevs_operational": 2, 00:22:44.381 "base_bdevs_list": [ 00:22:44.381 { 00:22:44.381 "name": null, 00:22:44.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.381 "is_configured": false, 00:22:44.381 "data_offset": 2048, 00:22:44.381 "data_size": 63488 00:22:44.381 }, 00:22:44.381 { 00:22:44.381 "name": null, 00:22:44.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.381 "is_configured": false, 00:22:44.381 "data_offset": 2048, 00:22:44.381 "data_size": 63488 00:22:44.381 }, 00:22:44.381 { 00:22:44.381 "name": "BaseBdev3", 00:22:44.382 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:44.382 "is_configured": true, 00:22:44.382 "data_offset": 2048, 00:22:44.382 "data_size": 63488 00:22:44.382 }, 00:22:44.382 { 00:22:44.382 "name": "BaseBdev4", 00:22:44.382 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:44.382 "is_configured": true, 00:22:44.382 "data_offset": 2048, 00:22:44.382 "data_size": 63488 00:22:44.382 } 00:22:44.382 ] 00:22:44.382 }' 00:22:44.382 00:04:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:44.382 00:04:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:45.325 "name": "raid_bdev1", 00:22:45.325 "uuid": "b04840ae-01dc-405e-b0a4-113a0cc8a7fc", 00:22:45.325 
"strip_size_kb": 0, 00:22:45.325 "state": "online", 00:22:45.325 "raid_level": "raid1", 00:22:45.325 "superblock": true, 00:22:45.325 "num_base_bdevs": 4, 00:22:45.325 "num_base_bdevs_discovered": 2, 00:22:45.325 "num_base_bdevs_operational": 2, 00:22:45.325 "base_bdevs_list": [ 00:22:45.325 { 00:22:45.325 "name": null, 00:22:45.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.325 "is_configured": false, 00:22:45.325 "data_offset": 2048, 00:22:45.325 "data_size": 63488 00:22:45.325 }, 00:22:45.325 { 00:22:45.325 "name": null, 00:22:45.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.325 "is_configured": false, 00:22:45.325 "data_offset": 2048, 00:22:45.325 "data_size": 63488 00:22:45.325 }, 00:22:45.325 { 00:22:45.325 "name": "BaseBdev3", 00:22:45.325 "uuid": "efb5ca72-43f1-532c-b2ea-04be2ae0d59e", 00:22:45.325 "is_configured": true, 00:22:45.325 "data_offset": 2048, 00:22:45.325 "data_size": 63488 00:22:45.325 }, 00:22:45.325 { 00:22:45.325 "name": "BaseBdev4", 00:22:45.325 "uuid": "5def3d69-471e-5f4f-8de8-612c080e5cf0", 00:22:45.325 "is_configured": true, 00:22:45.325 "data_offset": 2048, 00:22:45.325 "data_size": 63488 00:22:45.325 } 00:22:45.325 ] 00:22:45.325 }' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # killprocess 491604 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 491604 ']' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 491604 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:45.325 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 491604 00:22:45.607 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:45.607 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:45.607 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 491604' 00:22:45.607 killing process with pid 491604 00:22:45.607 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@965 -- # kill 491604 00:22:45.607 Received shutdown signal, test time was about 60.000000 seconds 00:22:45.607 00:22:45.607 Latency(us) 00:22:45.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:45.607 =================================================================================================================== 00:22:45.607 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:45.607 [2024-05-15 00:04:45.919972] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:45.607 00:04:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 491604 00:22:45.607 [2024-05-15 00:04:45.920075] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:45.607 
[2024-05-15 00:04:45.920139] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:45.607 [2024-05-15 00:04:45.920151] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c76730 name raid_bdev1, state offline 00:22:45.607 [2024-05-15 00:04:45.971948] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:45.607 00:04:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@797 -- # return 0 00:22:45.607 00:22:45.607 real 0m39.685s 00:22:45.607 user 0m57.562s 00:22:45.607 sys 0m7.104s 00:22:45.607 00:04:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:45.607 00:04:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.607 ************************************ 00:22:45.607 END TEST raid_rebuild_test_sb 00:22:45.607 ************************************ 00:22:45.865 00:04:46 bdev_raid -- bdev/bdev_raid.sh@825 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:22:45.865 00:04:46 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:22:45.865 00:04:46 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:45.865 00:04:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:45.865 ************************************ 00:22:45.865 START TEST raid_rebuild_test_io 00:22:45.865 ************************************ 00:22:45.865 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false true true 00:22:45.865 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:22:45.865 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:22:45.865 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:45.866 00:04:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # raid_pid=497160 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 497160 /var/tmp/spdk-raid.sock 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 497160 ']' 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:45.866 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:45.866 00:04:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:45.866 [2024-05-15 00:04:46.351682] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:22:45.866 [2024-05-15 00:04:46.351746] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid497160 ] 00:22:45.866 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:45.866 Zero copy mechanism will not be used. 
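The preamble above shows how raid_rebuild_test_io wires up its fixture: bdevperf is launched in RPC-wait mode with background random read/write I/O against the raid bdev, and each base bdev is a malloc bdev fronted by a passthru so it can later be deleted and re-created to simulate a member failure. The sketch below uses only commands and values that appear in this log; the per-bdev calls are condensed into a loop for brevity, and the backgrounding/wait handling stands in for the script's waitforlisten step.

    # start bdevperf in RPC-wait mode (-z); the test drives it over the raid socket
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 1 2 3 4; do
        # 32 MiB malloc bdev with 512-byte blocks, wrapped in a passthru, as in the log
        $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc -s $sock bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev$i
    done
    # the spare stacks malloc -> delay -> passthru (bdev_raid.sh@612-614), presumably so the
    # added write latency keeps the rebuild observable while it is in progress
    $rpc -s $sock bdev_malloc_create 32 512 -b spare_malloc
    $rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
    $rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1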
00:22:46.123 [2024-05-15 00:04:46.479953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:46.123 [2024-05-15 00:04:46.583424] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:46.123 [2024-05-15 00:04:46.646937] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.123 [2024-05-15 00:04:46.646967] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.716 00:04:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:46.716 00:04:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:22:46.716 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:46.716 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:46.974 BaseBdev1_malloc 00:22:46.974 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:47.231 [2024-05-15 00:04:47.663618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:47.231 [2024-05-15 00:04:47.663673] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.232 [2024-05-15 00:04:47.663698] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c0b50 00:22:47.232 [2024-05-15 00:04:47.663711] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.232 [2024-05-15 00:04:47.665488] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.232 [2024-05-15 00:04:47.665520] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:47.232 BaseBdev1 00:22:47.232 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:47.232 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:47.489 BaseBdev2_malloc 00:22:47.489 00:04:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:47.747 [2024-05-15 00:04:48.149709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:47.747 [2024-05-15 00:04:48.149757] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.747 [2024-05-15 00:04:48.149777] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2766d10 00:22:47.747 [2024-05-15 00:04:48.149790] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.747 [2024-05-15 00:04:48.151362] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.747 [2024-05-15 00:04:48.151390] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:47.747 BaseBdev2 00:22:47.747 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:47.747 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:48.005 BaseBdev3_malloc 00:22:48.005 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:48.263 [2024-05-15 00:04:48.635826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:48.263 [2024-05-15 00:04:48.635876] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.263 [2024-05-15 00:04:48.635898] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2768510 00:22:48.263 [2024-05-15 00:04:48.635911] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.263 [2024-05-15 00:04:48.637523] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.263 [2024-05-15 00:04:48.637552] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:48.263 BaseBdev3 00:22:48.263 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:48.263 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:48.522 BaseBdev4_malloc 00:22:48.522 00:04:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:48.780 [2024-05-15 00:04:49.114958] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:48.780 [2024-05-15 00:04:49.115008] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.780 [2024-05-15 00:04:49.115029] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2772290 00:22:48.780 [2024-05-15 00:04:49.115042] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.780 [2024-05-15 00:04:49.116633] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.780 [2024-05-15 00:04:49.116669] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:48.780 BaseBdev4 00:22:48.780 00:04:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:48.780 spare_malloc 00:22:49.038 00:04:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:49.038 spare_delay 00:22:49.038 00:04:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:49.296 [2024-05-15 00:04:49.770616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:49.296 [2024-05-15 00:04:49.770664] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.296 [2024-05-15 00:04:49.770686] vbdev_passthru.c: 
676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b7e00 00:22:49.296 [2024-05-15 00:04:49.770698] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.296 [2024-05-15 00:04:49.772320] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.296 [2024-05-15 00:04:49.772350] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:49.296 spare 00:22:49.296 00:04:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:49.554 [2024-05-15 00:04:50.007262] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:49.554 [2024-05-15 00:04:50.008594] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:49.554 [2024-05-15 00:04:50.008652] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:49.554 [2024-05-15 00:04:50.008697] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:49.554 [2024-05-15 00:04:50.008777] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x25bb840 00:22:49.554 [2024-05-15 00:04:50.008787] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:49.554 [2024-05-15 00:04:50.009004] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bf1f0 00:22:49.554 [2024-05-15 00:04:50.009157] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25bb840 00:22:49.554 [2024-05-15 00:04:50.009167] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25bb840 00:22:49.554 [2024-05-15 00:04:50.009289] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.554 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.812 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:49.812 "name": "raid_bdev1", 00:22:49.812 "uuid": 
"52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:49.812 "strip_size_kb": 0, 00:22:49.812 "state": "online", 00:22:49.812 "raid_level": "raid1", 00:22:49.812 "superblock": false, 00:22:49.812 "num_base_bdevs": 4, 00:22:49.812 "num_base_bdevs_discovered": 4, 00:22:49.812 "num_base_bdevs_operational": 4, 00:22:49.812 "base_bdevs_list": [ 00:22:49.812 { 00:22:49.812 "name": "BaseBdev1", 00:22:49.812 "uuid": "bf9b5108-fe89-5e2e-8bb3-c69bb9d55e13", 00:22:49.812 "is_configured": true, 00:22:49.812 "data_offset": 0, 00:22:49.812 "data_size": 65536 00:22:49.812 }, 00:22:49.812 { 00:22:49.812 "name": "BaseBdev2", 00:22:49.812 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:49.812 "is_configured": true, 00:22:49.812 "data_offset": 0, 00:22:49.812 "data_size": 65536 00:22:49.812 }, 00:22:49.812 { 00:22:49.812 "name": "BaseBdev3", 00:22:49.812 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:49.812 "is_configured": true, 00:22:49.812 "data_offset": 0, 00:22:49.812 "data_size": 65536 00:22:49.812 }, 00:22:49.812 { 00:22:49.812 "name": "BaseBdev4", 00:22:49.812 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:49.812 "is_configured": true, 00:22:49.812 "data_offset": 0, 00:22:49.812 "data_size": 65536 00:22:49.812 } 00:22:49.812 ] 00:22:49.812 }' 00:22:49.812 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:49.812 00:04:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:50.377 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:50.377 00:04:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:22:50.634 [2024-05-15 00:04:51.098411] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.634 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:22:50.634 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.634 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:50.892 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:22:50.892 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:22:50.892 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:50.892 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:50.892 [2024-05-15 00:04:51.413021] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bb3e0 00:22:50.892 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:50.892 Zero copy mechanism will not be used. 00:22:50.892 Running I/O for 60 seconds... 
00:22:51.151 [2024-05-15 00:04:51.527756] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:51.151 [2024-05-15 00:04:51.527934] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25bb3e0 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.151 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.409 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:51.409 "name": "raid_bdev1", 00:22:51.409 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:51.409 "strip_size_kb": 0, 00:22:51.409 "state": "online", 00:22:51.409 "raid_level": "raid1", 00:22:51.409 "superblock": false, 00:22:51.409 "num_base_bdevs": 4, 00:22:51.409 "num_base_bdevs_discovered": 3, 00:22:51.409 "num_base_bdevs_operational": 3, 00:22:51.409 "base_bdevs_list": [ 00:22:51.409 { 00:22:51.409 "name": null, 00:22:51.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.409 "is_configured": false, 00:22:51.409 "data_offset": 0, 00:22:51.409 "data_size": 65536 00:22:51.409 }, 00:22:51.409 { 00:22:51.409 "name": "BaseBdev2", 00:22:51.409 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:51.409 "is_configured": true, 00:22:51.409 "data_offset": 0, 00:22:51.409 "data_size": 65536 00:22:51.409 }, 00:22:51.409 { 00:22:51.409 "name": "BaseBdev3", 00:22:51.409 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:51.409 "is_configured": true, 00:22:51.409 "data_offset": 0, 00:22:51.409 "data_size": 65536 00:22:51.409 }, 00:22:51.409 { 00:22:51.409 "name": "BaseBdev4", 00:22:51.409 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:51.409 "is_configured": true, 00:22:51.409 "data_offset": 0, 00:22:51.409 "data_size": 65536 00:22:51.409 } 00:22:51.409 ] 00:22:51.409 }' 00:22:51.409 00:04:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:51.409 00:04:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:51.975 00:04:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:52.233 [2024-05-15 00:04:52.664123] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.233 00:04:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:22:52.233 [2024-05-15 00:04:52.738094] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2655890 00:22:52.233 [2024-05-15 00:04:52.740690] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:52.490 [2024-05-15 00:04:52.862864] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:52.748 [2024-05-15 00:04:53.105316] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:52.748 [2024-05-15 00:04:53.105985] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:53.005 [2024-05-15 00:04:53.424638] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.263 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.263 [2024-05-15 00:04:53.772973] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:53.521 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:53.521 "name": "raid_bdev1", 00:22:53.521 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:53.521 "strip_size_kb": 0, 00:22:53.521 "state": "online", 00:22:53.521 "raid_level": "raid1", 00:22:53.521 "superblock": false, 00:22:53.521 "num_base_bdevs": 4, 00:22:53.521 "num_base_bdevs_discovered": 4, 00:22:53.521 "num_base_bdevs_operational": 4, 00:22:53.521 "process": { 00:22:53.521 "type": "rebuild", 00:22:53.521 "target": "spare", 00:22:53.521 "progress": { 00:22:53.521 "blocks": 14336, 00:22:53.521 "percent": 21 00:22:53.521 } 00:22:53.521 }, 00:22:53.521 "base_bdevs_list": [ 00:22:53.521 { 00:22:53.521 "name": "spare", 00:22:53.521 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:22:53.521 "is_configured": true, 00:22:53.521 "data_offset": 0, 00:22:53.521 "data_size": 65536 00:22:53.521 }, 00:22:53.521 { 00:22:53.521 "name": "BaseBdev2", 00:22:53.521 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:53.521 "is_configured": true, 00:22:53.521 "data_offset": 0, 00:22:53.521 "data_size": 65536 00:22:53.521 }, 00:22:53.521 { 00:22:53.521 "name": "BaseBdev3", 00:22:53.521 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:53.521 "is_configured": true, 00:22:53.521 "data_offset": 0, 00:22:53.521 "data_size": 65536 00:22:53.521 }, 00:22:53.521 { 00:22:53.521 "name": "BaseBdev4", 00:22:53.521 "uuid": 
"176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:53.521 "is_configured": true, 00:22:53.521 "data_offset": 0, 00:22:53.521 "data_size": 65536 00:22:53.521 } 00:22:53.521 ] 00:22:53.521 }' 00:22:53.521 00:04:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:53.521 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:53.521 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:53.521 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:53.521 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:53.779 [2024-05-15 00:04:54.228779] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:53.779 [2024-05-15 00:04:54.296043] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.038 [2024-05-15 00:04:54.375889] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:54.038 [2024-05-15 00:04:54.486242] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:54.038 [2024-05-15 00:04:54.500691] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.038 [2024-05-15 00:04:54.540835] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25bb3e0 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.038 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.297 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:54.297 "name": "raid_bdev1", 00:22:54.297 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:54.297 "strip_size_kb": 0, 00:22:54.297 "state": "online", 00:22:54.297 "raid_level": "raid1", 00:22:54.297 "superblock": false, 00:22:54.297 "num_base_bdevs": 4, 00:22:54.297 "num_base_bdevs_discovered": 3, 00:22:54.297 
"num_base_bdevs_operational": 3, 00:22:54.297 "base_bdevs_list": [ 00:22:54.297 { 00:22:54.297 "name": null, 00:22:54.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.297 "is_configured": false, 00:22:54.297 "data_offset": 0, 00:22:54.297 "data_size": 65536 00:22:54.297 }, 00:22:54.297 { 00:22:54.297 "name": "BaseBdev2", 00:22:54.297 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:54.297 "is_configured": true, 00:22:54.297 "data_offset": 0, 00:22:54.297 "data_size": 65536 00:22:54.297 }, 00:22:54.297 { 00:22:54.297 "name": "BaseBdev3", 00:22:54.297 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:54.297 "is_configured": true, 00:22:54.297 "data_offset": 0, 00:22:54.297 "data_size": 65536 00:22:54.297 }, 00:22:54.297 { 00:22:54.297 "name": "BaseBdev4", 00:22:54.297 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:54.297 "is_configured": true, 00:22:54.297 "data_offset": 0, 00:22:54.297 "data_size": 65536 00:22:54.297 } 00:22:54.297 ] 00:22:54.297 }' 00:22:54.297 00:04:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:54.297 00:04:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.232 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:55.232 "name": "raid_bdev1", 00:22:55.232 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:55.233 "strip_size_kb": 0, 00:22:55.233 "state": "online", 00:22:55.233 "raid_level": "raid1", 00:22:55.233 "superblock": false, 00:22:55.233 "num_base_bdevs": 4, 00:22:55.233 "num_base_bdevs_discovered": 3, 00:22:55.233 "num_base_bdevs_operational": 3, 00:22:55.233 "base_bdevs_list": [ 00:22:55.233 { 00:22:55.233 "name": null, 00:22:55.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.233 "is_configured": false, 00:22:55.233 "data_offset": 0, 00:22:55.233 "data_size": 65536 00:22:55.233 }, 00:22:55.233 { 00:22:55.233 "name": "BaseBdev2", 00:22:55.233 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:55.233 "is_configured": true, 00:22:55.233 "data_offset": 0, 00:22:55.233 "data_size": 65536 00:22:55.233 }, 00:22:55.233 { 00:22:55.233 "name": "BaseBdev3", 00:22:55.233 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:55.233 "is_configured": true, 00:22:55.233 "data_offset": 0, 00:22:55.233 "data_size": 65536 00:22:55.233 }, 00:22:55.233 { 00:22:55.233 "name": "BaseBdev4", 00:22:55.233 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:55.233 "is_configured": true, 00:22:55.233 "data_offset": 0, 00:22:55.233 "data_size": 65536 00:22:55.233 } 00:22:55.233 ] 00:22:55.233 }' 00:22:55.233 00:04:55 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:55.233 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:55.233 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:55.491 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:55.491 00:04:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:55.491 [2024-05-15 00:04:56.059380] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:55.749 00:04:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:22:55.749 [2024-05-15 00:04:56.115382] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bbe80 00:22:55.749 [2024-05-15 00:04:56.116919] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:55.749 [2024-05-15 00:04:56.246816] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:55.749 [2024-05-15 00:04:56.247135] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:56.008 [2024-05-15 00:04:56.377468] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:56.008 [2024-05-15 00:04:56.378153] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:56.266 [2024-05-15 00:04:56.753872] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:56.524 [2024-05-15 00:04:56.862693] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:56.524 [2024-05-15 00:04:56.862872] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:56.524 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:56.524 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:56.524 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:56.524 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:56.524 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:56.782 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.782 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.782 [2024-05-15 00:04:57.310117] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:56.782 [2024-05-15 00:04:57.310755] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:56.782 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:56.782 "name": "raid_bdev1", 
00:22:56.782 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:56.782 "strip_size_kb": 0, 00:22:56.782 "state": "online", 00:22:56.782 "raid_level": "raid1", 00:22:56.782 "superblock": false, 00:22:56.782 "num_base_bdevs": 4, 00:22:56.782 "num_base_bdevs_discovered": 4, 00:22:56.782 "num_base_bdevs_operational": 4, 00:22:56.782 "process": { 00:22:56.782 "type": "rebuild", 00:22:56.782 "target": "spare", 00:22:56.782 "progress": { 00:22:56.782 "blocks": 16384, 00:22:56.782 "percent": 25 00:22:56.782 } 00:22:56.782 }, 00:22:56.782 "base_bdevs_list": [ 00:22:56.782 { 00:22:56.782 "name": "spare", 00:22:56.782 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:22:56.782 "is_configured": true, 00:22:56.782 "data_offset": 0, 00:22:56.782 "data_size": 65536 00:22:56.782 }, 00:22:56.782 { 00:22:56.782 "name": "BaseBdev2", 00:22:56.782 "uuid": "622b72db-ba60-5b00-9f71-2c66aeb450d1", 00:22:56.782 "is_configured": true, 00:22:56.782 "data_offset": 0, 00:22:56.782 "data_size": 65536 00:22:56.782 }, 00:22:56.782 { 00:22:56.782 "name": "BaseBdev3", 00:22:56.782 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:56.782 "is_configured": true, 00:22:56.782 "data_offset": 0, 00:22:56.782 "data_size": 65536 00:22:56.782 }, 00:22:56.782 { 00:22:56.782 "name": "BaseBdev4", 00:22:56.782 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:56.782 "is_configured": true, 00:22:56.782 "data_offset": 0, 00:22:56.782 "data_size": 65536 00:22:56.782 } 00:22:56.782 ] 00:22:56.782 }' 00:22:56.782 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:22:57.040 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:57.299 [2024-05-15 00:04:57.688492] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:57.299 [2024-05-15 00:04:57.696627] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:57.299 [2024-05-15 00:04:57.697762] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:57.299 [2024-05-15 00:04:57.807849] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25bb3e0 00:22:57.299 [2024-05-15 00:04:57.807882] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25bbe80 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.299 00:04:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.556 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:57.556 "name": "raid_bdev1", 00:22:57.556 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:57.556 "strip_size_kb": 0, 00:22:57.556 "state": "online", 00:22:57.556 "raid_level": "raid1", 00:22:57.556 "superblock": false, 00:22:57.556 "num_base_bdevs": 4, 00:22:57.556 "num_base_bdevs_discovered": 3, 00:22:57.556 "num_base_bdevs_operational": 3, 00:22:57.556 "process": { 00:22:57.556 "type": "rebuild", 00:22:57.556 "target": "spare", 00:22:57.556 "progress": { 00:22:57.556 "blocks": 24576, 00:22:57.556 "percent": 37 00:22:57.556 } 00:22:57.556 }, 00:22:57.556 "base_bdevs_list": [ 00:22:57.556 { 00:22:57.556 "name": "spare", 00:22:57.556 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:22:57.556 "is_configured": true, 00:22:57.556 "data_offset": 0, 00:22:57.556 "data_size": 65536 00:22:57.556 }, 00:22:57.556 { 00:22:57.556 "name": null, 00:22:57.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.556 "is_configured": false, 00:22:57.556 "data_offset": 0, 00:22:57.556 "data_size": 65536 00:22:57.556 }, 00:22:57.556 { 00:22:57.556 "name": "BaseBdev3", 00:22:57.556 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:57.556 "is_configured": true, 00:22:57.556 "data_offset": 0, 00:22:57.556 "data_size": 65536 00:22:57.556 }, 00:22:57.556 { 00:22:57.556 "name": "BaseBdev4", 00:22:57.556 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:57.556 "is_configured": true, 00:22:57.556 "data_offset": 0, 00:22:57.556 "data_size": 65536 00:22:57.556 } 00:22:57.556 ] 00:22:57.556 }' 00:22:57.556 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:57.556 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.556 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # local timeout=789 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # 
local target=spare 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.814 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.814 [2024-05-15 00:04:58.282864] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:57.814 [2024-05-15 00:04:58.283333] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:58.073 "name": "raid_bdev1", 00:22:58.073 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:58.073 "strip_size_kb": 0, 00:22:58.073 "state": "online", 00:22:58.073 "raid_level": "raid1", 00:22:58.073 "superblock": false, 00:22:58.073 "num_base_bdevs": 4, 00:22:58.073 "num_base_bdevs_discovered": 3, 00:22:58.073 "num_base_bdevs_operational": 3, 00:22:58.073 "process": { 00:22:58.073 "type": "rebuild", 00:22:58.073 "target": "spare", 00:22:58.073 "progress": { 00:22:58.073 "blocks": 28672, 00:22:58.073 "percent": 43 00:22:58.073 } 00:22:58.073 }, 00:22:58.073 "base_bdevs_list": [ 00:22:58.073 { 00:22:58.073 "name": "spare", 00:22:58.073 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:22:58.073 "is_configured": true, 00:22:58.073 "data_offset": 0, 00:22:58.073 "data_size": 65536 00:22:58.073 }, 00:22:58.073 { 00:22:58.073 "name": null, 00:22:58.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.073 "is_configured": false, 00:22:58.073 "data_offset": 0, 00:22:58.073 "data_size": 65536 00:22:58.073 }, 00:22:58.073 { 00:22:58.073 "name": "BaseBdev3", 00:22:58.073 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:58.073 "is_configured": true, 00:22:58.073 "data_offset": 0, 00:22:58.073 "data_size": 65536 00:22:58.073 }, 00:22:58.073 { 00:22:58.073 "name": "BaseBdev4", 00:22:58.073 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:58.073 "is_configured": true, 00:22:58.073 "data_offset": 0, 00:22:58.073 "data_size": 65536 00:22:58.073 } 00:22:58.073 ] 00:22:58.073 }' 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:58.073 00:04:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:58.073 [2024-05-15 00:04:58.598397] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:22:58.639 [2024-05-15 00:04:59.045020] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:58.897 [2024-05-15 00:04:59.292266] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:59.155 
00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.155 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.155 [2024-05-15 00:04:59.505585] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:59.456 "name": "raid_bdev1", 00:22:59.456 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:22:59.456 "strip_size_kb": 0, 00:22:59.456 "state": "online", 00:22:59.456 "raid_level": "raid1", 00:22:59.456 "superblock": false, 00:22:59.456 "num_base_bdevs": 4, 00:22:59.456 "num_base_bdevs_discovered": 3, 00:22:59.456 "num_base_bdevs_operational": 3, 00:22:59.456 "process": { 00:22:59.456 "type": "rebuild", 00:22:59.456 "target": "spare", 00:22:59.456 "progress": { 00:22:59.456 "blocks": 47104, 00:22:59.456 "percent": 71 00:22:59.456 } 00:22:59.456 }, 00:22:59.456 "base_bdevs_list": [ 00:22:59.456 { 00:22:59.456 "name": "spare", 00:22:59.456 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:22:59.456 "is_configured": true, 00:22:59.456 "data_offset": 0, 00:22:59.456 "data_size": 65536 00:22:59.456 }, 00:22:59.456 { 00:22:59.456 "name": null, 00:22:59.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.456 "is_configured": false, 00:22:59.456 "data_offset": 0, 00:22:59.456 "data_size": 65536 00:22:59.456 }, 00:22:59.456 { 00:22:59.456 "name": "BaseBdev3", 00:22:59.456 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:22:59.456 "is_configured": true, 00:22:59.456 "data_offset": 0, 00:22:59.456 "data_size": 65536 00:22:59.456 }, 00:22:59.456 { 00:22:59.456 "name": "BaseBdev4", 00:22:59.456 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:22:59.456 "is_configured": true, 00:22:59.456 "data_offset": 0, 00:22:59.456 "data_size": 65536 00:22:59.456 } 00:22:59.456 ] 00:22:59.456 }' 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.456 00:04:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:59.714 [2024-05-15 00:05:00.173489] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:00.282 [2024-05-15 00:05:00.625438] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:00.282 [2024-05-15 00:05:00.733659] 
bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:00.282 [2024-05-15 00:05:00.736221] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.282 00:05:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.540 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:00.540 "name": "raid_bdev1", 00:23:00.540 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:23:00.540 "strip_size_kb": 0, 00:23:00.540 "state": "online", 00:23:00.540 "raid_level": "raid1", 00:23:00.540 "superblock": false, 00:23:00.540 "num_base_bdevs": 4, 00:23:00.540 "num_base_bdevs_discovered": 3, 00:23:00.540 "num_base_bdevs_operational": 3, 00:23:00.540 "base_bdevs_list": [ 00:23:00.540 { 00:23:00.540 "name": "spare", 00:23:00.540 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:23:00.540 "is_configured": true, 00:23:00.540 "data_offset": 0, 00:23:00.540 "data_size": 65536 00:23:00.540 }, 00:23:00.540 { 00:23:00.540 "name": null, 00:23:00.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.540 "is_configured": false, 00:23:00.540 "data_offset": 0, 00:23:00.540 "data_size": 65536 00:23:00.540 }, 00:23:00.540 { 00:23:00.540 "name": "BaseBdev3", 00:23:00.540 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:23:00.540 "is_configured": true, 00:23:00.540 "data_offset": 0, 00:23:00.540 "data_size": 65536 00:23:00.540 }, 00:23:00.540 { 00:23:00.540 "name": "BaseBdev4", 00:23:00.540 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:23:00.540 "is_configured": true, 00:23:00.540 "data_offset": 0, 00:23:00.540 "data_size": 65536 00:23:00.540 } 00:23:00.540 ] 00:23:00.540 }' 00:23:00.540 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # break 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:00.799 00:05:01 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.799 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:01.058 "name": "raid_bdev1", 00:23:01.058 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:23:01.058 "strip_size_kb": 0, 00:23:01.058 "state": "online", 00:23:01.058 "raid_level": "raid1", 00:23:01.058 "superblock": false, 00:23:01.058 "num_base_bdevs": 4, 00:23:01.058 "num_base_bdevs_discovered": 3, 00:23:01.058 "num_base_bdevs_operational": 3, 00:23:01.058 "base_bdevs_list": [ 00:23:01.058 { 00:23:01.058 "name": "spare", 00:23:01.058 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:23:01.058 "is_configured": true, 00:23:01.058 "data_offset": 0, 00:23:01.058 "data_size": 65536 00:23:01.058 }, 00:23:01.058 { 00:23:01.058 "name": null, 00:23:01.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.058 "is_configured": false, 00:23:01.058 "data_offset": 0, 00:23:01.058 "data_size": 65536 00:23:01.058 }, 00:23:01.058 { 00:23:01.058 "name": "BaseBdev3", 00:23:01.058 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:23:01.058 "is_configured": true, 00:23:01.058 "data_offset": 0, 00:23:01.058 "data_size": 65536 00:23:01.058 }, 00:23:01.058 { 00:23:01.058 "name": "BaseBdev4", 00:23:01.058 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:23:01.058 "is_configured": true, 00:23:01.058 "data_offset": 0, 00:23:01.058 "data_size": 65536 00:23:01.058 } 00:23:01.058 ] 00:23:01.058 }' 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.058 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.316 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:01.316 "name": "raid_bdev1", 00:23:01.316 "uuid": "52015bbf-1fe2-435d-acaf-c022d63f5212", 00:23:01.316 "strip_size_kb": 0, 00:23:01.316 "state": "online", 00:23:01.316 "raid_level": "raid1", 00:23:01.316 "superblock": false, 00:23:01.316 "num_base_bdevs": 4, 00:23:01.316 "num_base_bdevs_discovered": 3, 00:23:01.316 "num_base_bdevs_operational": 3, 00:23:01.316 "base_bdevs_list": [ 00:23:01.316 { 00:23:01.316 "name": "spare", 00:23:01.316 "uuid": "60f70ede-d07f-5b4d-a3c9-cfbe81578b54", 00:23:01.316 "is_configured": true, 00:23:01.316 "data_offset": 0, 00:23:01.316 "data_size": 65536 00:23:01.316 }, 00:23:01.316 { 00:23:01.316 "name": null, 00:23:01.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.316 "is_configured": false, 00:23:01.316 "data_offset": 0, 00:23:01.316 "data_size": 65536 00:23:01.316 }, 00:23:01.316 { 00:23:01.316 "name": "BaseBdev3", 00:23:01.316 "uuid": "b47d0489-8956-5f0d-b00c-84874113780e", 00:23:01.316 "is_configured": true, 00:23:01.316 "data_offset": 0, 00:23:01.316 "data_size": 65536 00:23:01.316 }, 00:23:01.316 { 00:23:01.316 "name": "BaseBdev4", 00:23:01.316 "uuid": "176b18ed-08c5-5bbf-ad45-570da32c2fca", 00:23:01.316 "is_configured": true, 00:23:01.316 "data_offset": 0, 00:23:01.317 "data_size": 65536 00:23:01.317 } 00:23:01.317 ] 00:23:01.317 }' 00:23:01.317 00:05:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:01.317 00:05:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:01.883 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:02.141 [2024-05-15 00:05:02.611308] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:02.141 [2024-05-15 00:05:02.611345] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:02.141 00:23:02.141 Latency(us) 00:23:02.141 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:02.141 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:02.141 raid_bdev1 : 11.27 92.65 277.94 0.00 0.00 15156.94 311.65 123093.70 00:23:02.141 =================================================================================================================== 00:23:02.141 Total : 92.65 277.94 0.00 0.00 15156.94 311.65 123093.70 00:23:02.141 [2024-05-15 00:05:02.715537] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:02.141 [2024-05-15 00:05:02.715567] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:02.141 [2024-05-15 00:05:02.715661] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:02.141 [2024-05-15 00:05:02.715674] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25bb840 name raid_bdev1, state offline 00:23:02.141 0 00:23:02.399 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
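Between the first degraded run and the teardown above, the trace exercises the rebuild paths themselves: the spare is attached as a rebuild target (bdev_raid.sh@651), yanked out while the rebuild is in flight (sh@658), re-attached (sh@667), and BaseBdev2 is removed during the new rebuild (sh@700); once I/O completes, the raid bdev is deleted and the get-bdevs count is checked (continued just below). The corresponding rpc.py calls, condensed from the trace (the $RPC shorthand matches the socket used throughout; jq-based state verification omitted for brevity):

```bash
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Attach the delay-backed spare so a rebuild onto it starts.
$RPC bdev_raid_add_base_bdev raid_bdev1 spare

# Pull the spare back out while that rebuild is still running.
$RPC bdev_raid_remove_base_bdev spare

# Re-add the spare, then drop BaseBdev2 during the new rebuild.
$RPC bdev_raid_add_base_bdev raid_bdev1 spare
$RPC bdev_raid_remove_base_bdev BaseBdev2

# After the I/O job finishes, tear the array down and confirm nothing is left.
$RPC bdev_raid_delete raid_bdev1
$RPC bdev_raid_get_bdevs all | jq length    # expected: 0
```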
00:23:02.399 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # jq length 00:23:02.399 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:23:02.399 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:02.657 00:05:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:02.915 /dev/nbd0 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:02.915 1+0 records in 00:23:02.915 1+0 records out 00:23:02.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244672 s, 16.7 MB/s 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@885 -- # return 0 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z '' ']' 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # continue 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev3 ']' 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:02.915 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:03.173 /dev/nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:03.173 1+0 records in 00:23:03.173 1+0 records out 00:23:03.173 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297194 s, 13.8 MB/s 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@882 -- # size=4096 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:03.173 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev4 ']' 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:03.431 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:03.432 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 
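The nbd comparison recorded above reduces to the loop sketched below: each surviving base bdev is exported over NBD next to the device already sitting on /dev/nbd0 (exported earlier in the trace) and compared byte-for-byte with cmp. This is a reconstruction from the rpc.py sub-commands visible in the log (nbd_start_disk, nbd_stop_disk), not the literal body of test/bdev/bdev_raid.sh; the loop shape, the readiness poll, and the variable names are assumptions made for illustration.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    for bdev in BaseBdev3 BaseBdev4; do              # empty slots (already-removed bdevs) are skipped
        "$rpc" -s "$sock" nbd_start_disk "$bdev" /dev/nbd1
        # wait until the kernel has registered the device, then probe it with one direct 4 KiB read
        until grep -q -w nbd1 /proc/partitions; do sleep 0.1; done
        dd if=/dev/nbd1 of=/dev/null bs=4096 count=1 iflag=direct
        cmp -i 0 /dev/nbd0 /dev/nbd1                 # byte-for-byte compare; a non-zero exit fails the test
        "$rpc" -s "$sock" nbd_stop_disk /dev/nbd1
    done
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd0        # tear down the reference device last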
00:23:03.432 00:05:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:03.690 /dev/nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:03.690 1+0 records in 00:23:03.690 1+0 records out 00:23:03.690 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268899 s, 15.2 MB/s 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:03.690 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:03.949 00:05:04 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:03.949 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:04.207 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@795 -- # killprocess 497160 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 497160 ']' 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 497160 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 497160 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 497160' 00:23:04.465 killing process with pid 497160 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@965 -- # kill 497160 00:23:04.465 Received shutdown signal, test time was about 13.400163 seconds 00:23:04.465 00:23:04.465 Latency(us) 00:23:04.465 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:04.465 =================================================================================================================== 00:23:04.465 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:04.465 [2024-05-15 00:05:04.848128] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:04.465 00:05:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 497160 00:23:04.465 [2024-05-15 00:05:04.891931] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@797 -- # return 0 00:23:04.723 00:23:04.723 real 0m18.853s 00:23:04.723 user 0m29.187s 00:23:04.723 sys 0m3.312s 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:04.723 ************************************ 00:23:04.723 END TEST raid_rebuild_test_io 00:23:04.723 ************************************ 00:23:04.723 00:05:05 bdev_raid -- bdev/bdev_raid.sh@826 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:23:04.723 00:05:05 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:23:04.723 00:05:05 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:04.723 00:05:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:04.723 ************************************ 00:23:04.723 START TEST raid_rebuild_test_sb_io 00:23:04.723 ************************************ 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true true true 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:04.723 00:05:05 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # raid_pid=499869 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 499869 /var/tmp/spdk-raid.sock 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 499869 ']' 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:04.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:04.723 00:05:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:04.723 [2024-05-15 00:05:05.285585] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:23:04.723 [2024-05-15 00:05:05.285646] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid499869 ] 00:23:04.723 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:04.723 Zero copy mechanism will not be used. 
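The TEST START banner above marks the point where the harness launches a fresh bdevperf instance to drive background random I/O against the array while it is degraded and rebuilt; the zero-copy notice that follows is a direct consequence of the 3 MiB I/O size (-o 3M) exceeding bdevperf's 64 KiB zero-copy threshold. A rough sketch of that launch, with the flags copied verbatim from the command recorded in the trace; the readiness poll stands in for the autotest waitforlisten helper and is an assumption, not its actual implementation.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    sock=/var/tmp/spdk-raid.sock

    # -z keeps bdevperf idle until perform_tests is sent over RPC; -t 60 caps the run at 60 s;
    # -w randrw -M 50 is a 50/50 random read/write mix; -L bdev_raid enables bdev_raid debug logs.
    "$SPDK/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # Block until the application answers on the UNIX domain socket before issuing any RPCs.
    while ! "$SPDK/scripts/rpc.py" -s "$sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done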
00:23:04.982 [2024-05-15 00:05:05.406577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.982 [2024-05-15 00:05:05.511840] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:05.327 [2024-05-15 00:05:05.579945] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:05.327 [2024-05-15 00:05:05.579980] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:05.894 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:05.894 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:23:05.894 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:05.894 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:05.894 BaseBdev1_malloc 00:23:05.894 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:06.152 [2024-05-15 00:05:06.700330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:06.152 [2024-05-15 00:05:06.700384] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.152 [2024-05-15 00:05:06.700416] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7fdb50 00:23:06.152 [2024-05-15 00:05:06.700430] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.152 [2024-05-15 00:05:06.702190] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.152 [2024-05-15 00:05:06.702220] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:06.152 BaseBdev1 00:23:06.152 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:06.152 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:06.410 BaseBdev2_malloc 00:23:06.410 00:05:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:06.667 [2024-05-15 00:05:07.191695] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:06.667 [2024-05-15 00:05:07.191747] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.667 [2024-05-15 00:05:07.191767] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a3d10 00:23:06.667 [2024-05-15 00:05:07.191779] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.667 [2024-05-15 00:05:07.193382] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.667 [2024-05-15 00:05:07.193421] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:06.667 BaseBdev2 00:23:06.667 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:06.667 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:06.925 BaseBdev3_malloc 00:23:06.925 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:07.182 [2024-05-15 00:05:07.665578] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:07.182 [2024-05-15 00:05:07.665630] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.182 [2024-05-15 00:05:07.665650] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a5510 00:23:07.182 [2024-05-15 00:05:07.665663] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.183 [2024-05-15 00:05:07.667218] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.183 [2024-05-15 00:05:07.667248] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:07.183 BaseBdev3 00:23:07.183 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:07.183 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:07.440 BaseBdev4_malloc 00:23:07.440 00:05:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:07.699 [2024-05-15 00:05:08.151449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:07.699 [2024-05-15 00:05:08.151498] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.699 [2024-05-15 00:05:08.151521] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9af290 00:23:07.699 [2024-05-15 00:05:08.151533] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.699 [2024-05-15 00:05:08.153114] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.699 [2024-05-15 00:05:08.153142] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:07.699 BaseBdev4 00:23:07.699 00:05:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:07.956 spare_malloc 00:23:07.957 00:05:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:08.214 spare_delay 00:23:08.214 00:05:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:08.471 [2024-05-15 00:05:08.887244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:08.471 [2024-05-15 00:05:08.887291] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.471 [2024-05-15 
00:05:08.887314] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7f4e00 00:23:08.471 [2024-05-15 00:05:08.887327] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.471 [2024-05-15 00:05:08.888950] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.471 [2024-05-15 00:05:08.888978] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:08.471 spare 00:23:08.471 00:05:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:08.729 [2024-05-15 00:05:09.131920] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:08.729 [2024-05-15 00:05:09.133251] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:08.729 [2024-05-15 00:05:09.133308] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:08.729 [2024-05-15 00:05:09.133353] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:08.729 [2024-05-15 00:05:09.133553] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x7f8840 00:23:08.729 [2024-05-15 00:05:09.133565] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:08.729 [2024-05-15 00:05:09.133771] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a2540 00:23:08.729 [2024-05-15 00:05:09.133923] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7f8840 00:23:08.729 [2024-05-15 00:05:09.133933] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x7f8840 00:23:08.729 [2024-05-15 00:05:09.134034] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.729 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:08.729 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:08.729 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.730 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.988 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:23:08.988 "name": "raid_bdev1", 00:23:08.988 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:08.988 "strip_size_kb": 0, 00:23:08.988 "state": "online", 00:23:08.988 "raid_level": "raid1", 00:23:08.988 "superblock": true, 00:23:08.988 "num_base_bdevs": 4, 00:23:08.988 "num_base_bdevs_discovered": 4, 00:23:08.988 "num_base_bdevs_operational": 4, 00:23:08.988 "base_bdevs_list": [ 00:23:08.988 { 00:23:08.988 "name": "BaseBdev1", 00:23:08.988 "uuid": "f5573852-2484-57fc-b708-2135580c3a56", 00:23:08.988 "is_configured": true, 00:23:08.988 "data_offset": 2048, 00:23:08.988 "data_size": 63488 00:23:08.988 }, 00:23:08.988 { 00:23:08.988 "name": "BaseBdev2", 00:23:08.988 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:08.988 "is_configured": true, 00:23:08.988 "data_offset": 2048, 00:23:08.988 "data_size": 63488 00:23:08.988 }, 00:23:08.988 { 00:23:08.988 "name": "BaseBdev3", 00:23:08.988 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:08.988 "is_configured": true, 00:23:08.988 "data_offset": 2048, 00:23:08.988 "data_size": 63488 00:23:08.988 }, 00:23:08.988 { 00:23:08.988 "name": "BaseBdev4", 00:23:08.988 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:08.988 "is_configured": true, 00:23:08.988 "data_offset": 2048, 00:23:08.988 "data_size": 63488 00:23:08.988 } 00:23:08.988 ] 00:23:08.988 }' 00:23:08.988 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:08.988 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:09.554 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:23:09.554 00:05:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:09.812 [2024-05-15 00:05:10.202997] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:09.812 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:23:09.812 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.812 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:10.070 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:23:10.070 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:23:10.070 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:10.070 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:10.070 [2024-05-15 00:05:10.557803] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7fd2c0 00:23:10.070 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:10.070 Zero copy mechanism will not be used. 00:23:10.070 Running I/O for 60 seconds... 
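The configuration traced above (four malloc-backed passthru base bdevs, a delayed spare, and a superblock-enabled raid1 array) condenses into the RPC sequence below. The rpc.py sub-commands and their arguments are copied from the trace; the loop and the final jq check are a simplified sketch of what bdev_raid.sh does around them, not its exact code.

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # Four 32 MiB, 512-byte-block malloc devices, each wrapped in a passthru bdev.
    for i in 1 2 3 4; do
        rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        rpc bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done

    # The spare sits behind a delay bdev (the -r/-t/-w/-n values from the trace add latency on writes only).
    rpc bdev_malloc_create 32 512 -b spare_malloc
    rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    rpc bdev_passthru_create -b spare_delay -p spare

    # Assemble a raid1 array with on-disk superblocks (-s) from the four base bdevs.
    rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # The array should report state "online" with all four members configured, as in the JSON above.
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'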
00:23:10.328 [2024-05-15 00:05:10.704640] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:10.328 [2024-05-15 00:05:10.712815] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x7fd2c0 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.328 00:05:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.584 00:05:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:10.584 "name": "raid_bdev1", 00:23:10.584 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:10.584 "strip_size_kb": 0, 00:23:10.584 "state": "online", 00:23:10.584 "raid_level": "raid1", 00:23:10.584 "superblock": true, 00:23:10.584 "num_base_bdevs": 4, 00:23:10.584 "num_base_bdevs_discovered": 3, 00:23:10.584 "num_base_bdevs_operational": 3, 00:23:10.584 "base_bdevs_list": [ 00:23:10.584 { 00:23:10.584 "name": null, 00:23:10.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.584 "is_configured": false, 00:23:10.584 "data_offset": 2048, 00:23:10.584 "data_size": 63488 00:23:10.584 }, 00:23:10.584 { 00:23:10.584 "name": "BaseBdev2", 00:23:10.584 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:10.584 "is_configured": true, 00:23:10.584 "data_offset": 2048, 00:23:10.584 "data_size": 63488 00:23:10.584 }, 00:23:10.584 { 00:23:10.584 "name": "BaseBdev3", 00:23:10.584 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:10.584 "is_configured": true, 00:23:10.584 "data_offset": 2048, 00:23:10.584 "data_size": 63488 00:23:10.584 }, 00:23:10.584 { 00:23:10.584 "name": "BaseBdev4", 00:23:10.584 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:10.584 "is_configured": true, 00:23:10.584 "data_offset": 2048, 00:23:10.584 "data_size": 63488 00:23:10.584 } 00:23:10.584 ] 00:23:10.584 }' 00:23:10.584 00:05:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:10.584 00:05:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:11.149 00:05:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
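The remove/verify/add cycle above is the heart of the rebuild scenario: while bdevperf's background I/O is running, one member is pulled from the online array, the state is re-checked to confirm a degraded three-of-four raid1, and the spare is attached to trigger the rebuild (the "Started rebuild on raid bdev raid_bdev1" notice that follows). A sketch of just those RPCs, reusing the names from the trace; the jq field check is an illustrative assumption rather than the test's own verify helper.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-raid.sock "$@"; }

    # Release the background workload prepared earlier (bdevperf was started with -z).
    "$SPDK/examples/bdev/bdevperf/bdevperf.py" -s /var/tmp/spdk-raid.sock perform_tests &

    # Degrade the array while I/O is in flight.
    rpc bdev_raid_remove_base_bdev BaseBdev1

    # raid1 keeps the array online with three of four members; expect "3" here.
    rpc bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_operational'

    # Attach the spare; this is what starts the rebuild process logged by bdev_raid.c.
    rpc bdev_raid_add_base_bdev raid_bdev1 spare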
00:23:11.406 [2024-05-15 00:05:11.866825] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:11.406 00:05:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:23:11.406 [2024-05-15 00:05:11.966458] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a3030 00:23:11.406 [2024-05-15 00:05:11.968848] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:11.665 [2024-05-15 00:05:12.071335] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:11.665 [2024-05-15 00:05:12.072530] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:11.923 [2024-05-15 00:05:12.295715] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:11.923 [2024-05-15 00:05:12.296374] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:12.490 [2024-05-15 00:05:12.793261] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:12.490 [2024-05-15 00:05:12.793979] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.490 00:05:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.748 [2024-05-15 00:05:13.142467] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:12.748 "name": "raid_bdev1", 00:23:12.748 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:12.748 "strip_size_kb": 0, 00:23:12.748 "state": "online", 00:23:12.748 "raid_level": "raid1", 00:23:12.748 "superblock": true, 00:23:12.748 "num_base_bdevs": 4, 00:23:12.748 "num_base_bdevs_discovered": 4, 00:23:12.748 "num_base_bdevs_operational": 4, 00:23:12.748 "process": { 00:23:12.748 "type": "rebuild", 00:23:12.748 "target": "spare", 00:23:12.748 "progress": { 00:23:12.748 "blocks": 14336, 00:23:12.748 "percent": 22 00:23:12.748 } 00:23:12.748 }, 00:23:12.748 "base_bdevs_list": [ 00:23:12.748 { 00:23:12.748 "name": "spare", 00:23:12.748 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:12.748 "is_configured": true, 00:23:12.748 "data_offset": 2048, 00:23:12.748 "data_size": 63488 00:23:12.748 }, 00:23:12.748 { 00:23:12.748 "name": "BaseBdev2", 00:23:12.748 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:12.748 
"is_configured": true, 00:23:12.748 "data_offset": 2048, 00:23:12.748 "data_size": 63488 00:23:12.748 }, 00:23:12.748 { 00:23:12.748 "name": "BaseBdev3", 00:23:12.748 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:12.748 "is_configured": true, 00:23:12.748 "data_offset": 2048, 00:23:12.748 "data_size": 63488 00:23:12.748 }, 00:23:12.748 { 00:23:12.748 "name": "BaseBdev4", 00:23:12.748 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:12.748 "is_configured": true, 00:23:12.748 "data_offset": 2048, 00:23:12.748 "data_size": 63488 00:23:12.748 } 00:23:12.748 ] 00:23:12.748 }' 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:12.748 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:12.748 [2024-05-15 00:05:13.300552] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:13.006 [2024-05-15 00:05:13.497055] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:13.263 [2024-05-15 00:05:13.623057] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:13.263 [2024-05-15 00:05:13.633421] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.263 [2024-05-15 00:05:13.654895] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x7fd2c0 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.263 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.521 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:13.521 "name": "raid_bdev1", 00:23:13.521 
"uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:13.521 "strip_size_kb": 0, 00:23:13.521 "state": "online", 00:23:13.521 "raid_level": "raid1", 00:23:13.521 "superblock": true, 00:23:13.521 "num_base_bdevs": 4, 00:23:13.521 "num_base_bdevs_discovered": 3, 00:23:13.521 "num_base_bdevs_operational": 3, 00:23:13.521 "base_bdevs_list": [ 00:23:13.521 { 00:23:13.521 "name": null, 00:23:13.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.521 "is_configured": false, 00:23:13.521 "data_offset": 2048, 00:23:13.521 "data_size": 63488 00:23:13.521 }, 00:23:13.521 { 00:23:13.521 "name": "BaseBdev2", 00:23:13.521 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:13.521 "is_configured": true, 00:23:13.521 "data_offset": 2048, 00:23:13.521 "data_size": 63488 00:23:13.521 }, 00:23:13.521 { 00:23:13.521 "name": "BaseBdev3", 00:23:13.521 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:13.521 "is_configured": true, 00:23:13.521 "data_offset": 2048, 00:23:13.521 "data_size": 63488 00:23:13.521 }, 00:23:13.521 { 00:23:13.521 "name": "BaseBdev4", 00:23:13.521 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:13.521 "is_configured": true, 00:23:13.521 "data_offset": 2048, 00:23:13.521 "data_size": 63488 00:23:13.521 } 00:23:13.521 ] 00:23:13.521 }' 00:23:13.521 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:13.521 00:05:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.088 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:14.346 "name": "raid_bdev1", 00:23:14.346 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:14.346 "strip_size_kb": 0, 00:23:14.346 "state": "online", 00:23:14.346 "raid_level": "raid1", 00:23:14.346 "superblock": true, 00:23:14.346 "num_base_bdevs": 4, 00:23:14.346 "num_base_bdevs_discovered": 3, 00:23:14.346 "num_base_bdevs_operational": 3, 00:23:14.346 "base_bdevs_list": [ 00:23:14.346 { 00:23:14.346 "name": null, 00:23:14.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.346 "is_configured": false, 00:23:14.346 "data_offset": 2048, 00:23:14.346 "data_size": 63488 00:23:14.346 }, 00:23:14.346 { 00:23:14.346 "name": "BaseBdev2", 00:23:14.346 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:14.346 "is_configured": true, 00:23:14.346 "data_offset": 2048, 00:23:14.346 "data_size": 63488 00:23:14.346 }, 00:23:14.346 { 00:23:14.346 "name": "BaseBdev3", 00:23:14.346 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:14.346 "is_configured": true, 00:23:14.346 "data_offset": 2048, 00:23:14.346 "data_size": 63488 00:23:14.346 }, 
00:23:14.346 { 00:23:14.346 "name": "BaseBdev4", 00:23:14.346 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:14.346 "is_configured": true, 00:23:14.346 "data_offset": 2048, 00:23:14.346 "data_size": 63488 00:23:14.346 } 00:23:14.346 ] 00:23:14.346 }' 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:14.346 00:05:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:14.604 [2024-05-15 00:05:15.144416] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:14.862 00:05:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:23:14.862 [2024-05-15 00:05:15.235688] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a10d0 00:23:14.862 [2024-05-15 00:05:15.237218] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:14.862 [2024-05-15 00:05:15.375791] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:14.862 [2024-05-15 00:05:15.377021] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:15.120 [2024-05-15 00:05:15.579218] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:15.120 [2024-05-15 00:05:15.579414] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:15.378 [2024-05-15 00:05:15.872045] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:15.636 [2024-05-15 00:05:16.122392] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.636 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.894 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:15.894 "name": "raid_bdev1", 00:23:15.894 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:15.894 "strip_size_kb": 0, 00:23:15.894 "state": "online", 00:23:15.894 
"raid_level": "raid1", 00:23:15.894 "superblock": true, 00:23:15.894 "num_base_bdevs": 4, 00:23:15.894 "num_base_bdevs_discovered": 4, 00:23:15.894 "num_base_bdevs_operational": 4, 00:23:15.894 "process": { 00:23:15.894 "type": "rebuild", 00:23:15.894 "target": "spare", 00:23:15.894 "progress": { 00:23:15.894 "blocks": 12288, 00:23:15.894 "percent": 19 00:23:15.894 } 00:23:15.894 }, 00:23:15.894 "base_bdevs_list": [ 00:23:15.894 { 00:23:15.894 "name": "spare", 00:23:15.894 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:15.894 "is_configured": true, 00:23:15.894 "data_offset": 2048, 00:23:15.894 "data_size": 63488 00:23:15.894 }, 00:23:15.894 { 00:23:15.894 "name": "BaseBdev2", 00:23:15.894 "uuid": "1beca5ec-7de6-5593-97e8-70ffc6c96f25", 00:23:15.894 "is_configured": true, 00:23:15.894 "data_offset": 2048, 00:23:15.894 "data_size": 63488 00:23:15.894 }, 00:23:15.894 { 00:23:15.894 "name": "BaseBdev3", 00:23:15.894 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:15.894 "is_configured": true, 00:23:15.894 "data_offset": 2048, 00:23:15.894 "data_size": 63488 00:23:15.894 }, 00:23:15.894 { 00:23:15.894 "name": "BaseBdev4", 00:23:15.894 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:15.894 "is_configured": true, 00:23:15.894 "data_offset": 2048, 00:23:15.894 "data_size": 63488 00:23:15.894 } 00:23:15.894 ] 00:23:15.894 }' 00:23:15.894 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:23:16.150 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:23:16.150 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:23:16.151 00:05:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:16.407 [2024-05-15 00:05:16.752475] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:16.407 [2024-05-15 00:05:16.815046] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:16.407 [2024-05-15 00:05:16.923254] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x7fd2c0 00:23:16.407 [2024-05-15 00:05:16.923281] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x9a10d0 00:23:16.665 [2024-05-15 00:05:17.073313] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.665 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:16.923 "name": "raid_bdev1", 00:23:16.923 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:16.923 "strip_size_kb": 0, 00:23:16.923 "state": "online", 00:23:16.923 "raid_level": "raid1", 00:23:16.923 "superblock": true, 00:23:16.923 "num_base_bdevs": 4, 00:23:16.923 "num_base_bdevs_discovered": 3, 00:23:16.923 "num_base_bdevs_operational": 3, 00:23:16.923 "process": { 00:23:16.923 "type": "rebuild", 00:23:16.923 "target": "spare", 00:23:16.923 "progress": { 00:23:16.923 "blocks": 22528, 00:23:16.923 "percent": 35 00:23:16.923 } 00:23:16.923 }, 00:23:16.923 "base_bdevs_list": [ 00:23:16.923 { 00:23:16.923 "name": "spare", 00:23:16.923 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:16.923 "is_configured": true, 00:23:16.923 "data_offset": 2048, 00:23:16.923 "data_size": 63488 00:23:16.923 }, 00:23:16.923 { 00:23:16.923 "name": null, 00:23:16.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.923 "is_configured": false, 00:23:16.923 "data_offset": 2048, 00:23:16.923 "data_size": 63488 00:23:16.923 }, 00:23:16.923 { 00:23:16.923 "name": "BaseBdev3", 00:23:16.923 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:16.923 "is_configured": true, 00:23:16.923 "data_offset": 2048, 00:23:16.923 "data_size": 63488 00:23:16.923 }, 00:23:16.923 { 00:23:16.923 "name": "BaseBdev4", 00:23:16.923 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:16.923 "is_configured": true, 00:23:16.923 "data_offset": 2048, 00:23:16.923 "data_size": 63488 00:23:16.923 } 00:23:16.923 ] 00:23:16.923 }' 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # local timeout=808 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:16.923 
00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.923 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.181 [2024-05-15 00:05:17.543776] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:17.181 "name": "raid_bdev1", 00:23:17.181 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:17.181 "strip_size_kb": 0, 00:23:17.181 "state": "online", 00:23:17.181 "raid_level": "raid1", 00:23:17.181 "superblock": true, 00:23:17.181 "num_base_bdevs": 4, 00:23:17.181 "num_base_bdevs_discovered": 3, 00:23:17.181 "num_base_bdevs_operational": 3, 00:23:17.181 "process": { 00:23:17.181 "type": "rebuild", 00:23:17.181 "target": "spare", 00:23:17.181 "progress": { 00:23:17.181 "blocks": 28672, 00:23:17.181 "percent": 45 00:23:17.181 } 00:23:17.181 }, 00:23:17.181 "base_bdevs_list": [ 00:23:17.181 { 00:23:17.181 "name": "spare", 00:23:17.181 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:17.181 "is_configured": true, 00:23:17.181 "data_offset": 2048, 00:23:17.181 "data_size": 63488 00:23:17.181 }, 00:23:17.181 { 00:23:17.181 "name": null, 00:23:17.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.181 "is_configured": false, 00:23:17.181 "data_offset": 2048, 00:23:17.181 "data_size": 63488 00:23:17.181 }, 00:23:17.181 { 00:23:17.181 "name": "BaseBdev3", 00:23:17.181 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:17.181 "is_configured": true, 00:23:17.181 "data_offset": 2048, 00:23:17.181 "data_size": 63488 00:23:17.181 }, 00:23:17.181 { 00:23:17.181 "name": "BaseBdev4", 00:23:17.181 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:17.181 "is_configured": true, 00:23:17.181 "data_offset": 2048, 00:23:17.181 "data_size": 63488 00:23:17.181 } 00:23:17.181 ] 00:23:17.181 }' 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.181 00:05:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:23:18.115 [2024-05-15 00:05:18.563203] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:18.373 00:05:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.373 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.631 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:18.631 "name": "raid_bdev1", 00:23:18.631 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:18.631 "strip_size_kb": 0, 00:23:18.631 "state": "online", 00:23:18.631 "raid_level": "raid1", 00:23:18.631 "superblock": true, 00:23:18.631 "num_base_bdevs": 4, 00:23:18.631 "num_base_bdevs_discovered": 3, 00:23:18.631 "num_base_bdevs_operational": 3, 00:23:18.631 "process": { 00:23:18.631 "type": "rebuild", 00:23:18.631 "target": "spare", 00:23:18.631 "progress": { 00:23:18.631 "blocks": 51200, 00:23:18.631 "percent": 80 00:23:18.631 } 00:23:18.631 }, 00:23:18.631 "base_bdevs_list": [ 00:23:18.631 { 00:23:18.631 "name": "spare", 00:23:18.631 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:18.631 "is_configured": true, 00:23:18.631 "data_offset": 2048, 00:23:18.631 "data_size": 63488 00:23:18.631 }, 00:23:18.631 { 00:23:18.631 "name": null, 00:23:18.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.631 "is_configured": false, 00:23:18.631 "data_offset": 2048, 00:23:18.631 "data_size": 63488 00:23:18.631 }, 00:23:18.631 { 00:23:18.631 "name": "BaseBdev3", 00:23:18.631 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:18.631 "is_configured": true, 00:23:18.631 "data_offset": 2048, 00:23:18.631 "data_size": 63488 00:23:18.631 }, 00:23:18.631 { 00:23:18.631 "name": "BaseBdev4", 00:23:18.631 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:18.631 "is_configured": true, 00:23:18.631 "data_offset": 2048, 00:23:18.631 "data_size": 63488 00:23:18.631 } 00:23:18.631 ] 00:23:18.631 }' 00:23:18.631 00:05:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:18.631 [2024-05-15 00:05:19.016388] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:18.631 [2024-05-15 00:05:19.016609] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:18.631 00:05:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:18.631 00:05:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:18.631 00:05:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:18.631 00:05:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:23:18.889 [2024-05-15 00:05:19.452456] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:19.147 [2024-05-15 00:05:19.683372] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:19.405 [2024-05-15 00:05:19.791619] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: 
Finished rebuild on raid bdev raid_bdev1 00:23:19.405 [2024-05-15 00:05:19.794985] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.663 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:19.921 "name": "raid_bdev1", 00:23:19.921 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:19.921 "strip_size_kb": 0, 00:23:19.921 "state": "online", 00:23:19.921 "raid_level": "raid1", 00:23:19.921 "superblock": true, 00:23:19.921 "num_base_bdevs": 4, 00:23:19.921 "num_base_bdevs_discovered": 3, 00:23:19.921 "num_base_bdevs_operational": 3, 00:23:19.921 "base_bdevs_list": [ 00:23:19.921 { 00:23:19.921 "name": "spare", 00:23:19.921 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:19.921 "is_configured": true, 00:23:19.921 "data_offset": 2048, 00:23:19.921 "data_size": 63488 00:23:19.921 }, 00:23:19.921 { 00:23:19.921 "name": null, 00:23:19.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.921 "is_configured": false, 00:23:19.921 "data_offset": 2048, 00:23:19.921 "data_size": 63488 00:23:19.921 }, 00:23:19.921 { 00:23:19.921 "name": "BaseBdev3", 00:23:19.921 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:19.921 "is_configured": true, 00:23:19.921 "data_offset": 2048, 00:23:19.921 "data_size": 63488 00:23:19.921 }, 00:23:19.921 { 00:23:19.921 "name": "BaseBdev4", 00:23:19.921 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:19.921 "is_configured": true, 00:23:19.921 "data_offset": 2048, 00:23:19.921 "data_size": 63488 00:23:19.921 } 00:23:19.921 ] 00:23:19.921 }' 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # break 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:19.921 00:05:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.921 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.179 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:20.179 "name": "raid_bdev1", 00:23:20.179 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:20.179 "strip_size_kb": 0, 00:23:20.179 "state": "online", 00:23:20.179 "raid_level": "raid1", 00:23:20.179 "superblock": true, 00:23:20.179 "num_base_bdevs": 4, 00:23:20.179 "num_base_bdevs_discovered": 3, 00:23:20.179 "num_base_bdevs_operational": 3, 00:23:20.179 "base_bdevs_list": [ 00:23:20.179 { 00:23:20.179 "name": "spare", 00:23:20.179 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:20.179 "is_configured": true, 00:23:20.179 "data_offset": 2048, 00:23:20.179 "data_size": 63488 00:23:20.179 }, 00:23:20.179 { 00:23:20.179 "name": null, 00:23:20.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.179 "is_configured": false, 00:23:20.179 "data_offset": 2048, 00:23:20.179 "data_size": 63488 00:23:20.179 }, 00:23:20.179 { 00:23:20.179 "name": "BaseBdev3", 00:23:20.179 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:20.179 "is_configured": true, 00:23:20.179 "data_offset": 2048, 00:23:20.179 "data_size": 63488 00:23:20.179 }, 00:23:20.179 { 00:23:20.179 "name": "BaseBdev4", 00:23:20.179 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:20.179 "is_configured": true, 00:23:20.179 "data_offset": 2048, 00:23:20.179 "data_size": 63488 00:23:20.179 } 00:23:20.179 ] 00:23:20.179 }' 00:23:20.179 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:20.179 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:20.179 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:20.437 00:05:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.437 00:05:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.437 00:05:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:20.437 "name": "raid_bdev1", 00:23:20.437 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:20.437 "strip_size_kb": 0, 00:23:20.437 "state": "online", 00:23:20.437 "raid_level": "raid1", 00:23:20.437 "superblock": true, 00:23:20.437 "num_base_bdevs": 4, 00:23:20.437 "num_base_bdevs_discovered": 3, 00:23:20.437 "num_base_bdevs_operational": 3, 00:23:20.437 "base_bdevs_list": [ 00:23:20.437 { 00:23:20.437 "name": "spare", 00:23:20.437 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:20.437 "is_configured": true, 00:23:20.437 "data_offset": 2048, 00:23:20.437 "data_size": 63488 00:23:20.437 }, 00:23:20.437 { 00:23:20.437 "name": null, 00:23:20.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.437 "is_configured": false, 00:23:20.437 "data_offset": 2048, 00:23:20.437 "data_size": 63488 00:23:20.437 }, 00:23:20.437 { 00:23:20.437 "name": "BaseBdev3", 00:23:20.437 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:20.437 "is_configured": true, 00:23:20.437 "data_offset": 2048, 00:23:20.437 "data_size": 63488 00:23:20.437 }, 00:23:20.437 { 00:23:20.437 "name": "BaseBdev4", 00:23:20.437 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:20.437 "is_configured": true, 00:23:20.437 "data_offset": 2048, 00:23:20.437 "data_size": 63488 00:23:20.437 } 00:23:20.437 ] 00:23:20.437 }' 00:23:20.437 00:05:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:20.437 00:05:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:21.371 00:05:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:21.371 [2024-05-15 00:05:21.843498] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:21.371 [2024-05-15 00:05:21.843530] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:21.371 00:23:21.371 Latency(us) 00:23:21.371 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:21.371 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:21.371 raid_bdev1 : 11.31 86.84 260.53 0.00 0.00 16119.58 320.56 120358.29 00:23:21.371 =================================================================================================================== 00:23:21.371 Total : 86.84 260.53 0.00 0.00 16119.58 320.56 120358.29 00:23:21.371 [2024-05-15 00:05:21.899586] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.371 [2024-05-15 00:05:21.899616] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:21.371 [2024-05-15 00:05:21.899714] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:21.371 [2024-05-15 00:05:21.899727] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7f8840 name raid_bdev1, state offline 00:23:21.371 0 00:23:21.371 00:05:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # jq length 00:23:21.371 
00:05:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:21.629 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:21.887 /dev/nbd0 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:21.887 1+0 records in 00:23:21.887 1+0 records out 00:23:21.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305627 s, 13.4 MB/s 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z '' ']' 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # continue 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:21.887 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev3 ']' 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:21.888 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:22.145 /dev/nbd1 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.145 1+0 records in 
00:23:22.145 1+0 records out 00:23:22.145 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028044 s, 14.6 MB/s 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:22.145 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:23:22.146 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.146 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.146 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.403 00:05:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev4 ']' 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.661 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:22.919 /dev/nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.919 1+0 records in 00:23:22.919 1+0 records out 00:23:22.919 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292162 s, 14.0 MB/s 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:22.919 
00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.919 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:23.175 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:23:23.432 00:05:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:23.689 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:23.947 [2024-05-15 00:05:24.444311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:23.947 [2024-05-15 00:05:24.444357] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.947 [2024-05-15 00:05:24.444377] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7fdd80 00:23:23.947 [2024-05-15 00:05:24.444390] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.947 [2024-05-15 00:05:24.445999] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.947 [2024-05-15 00:05:24.446028] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:23.947 [2024-05-15 00:05:24.446096] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:23.947 [2024-05-15 00:05:24.446123] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:23.947 BaseBdev1 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z '' ']' 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # continue 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev3 ']' 00:23:23.947 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev3 00:23:24.205 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:24.464 [2024-05-15 00:05:24.937675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:24.464 [2024-05-15 00:05:24.937717] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.464 [2024-05-15 00:05:24.937738] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x891fa0 00:23:24.464 [2024-05-15 00:05:24.937751] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.464 [2024-05-15 00:05:24.938072] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.464 [2024-05-15 00:05:24.938089] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:24.464 [2024-05-15 00:05:24.938150] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev3 00:23:24.464 [2024-05-15 00:05:24.938162] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev3 (4) greater than existing raid bdev raid_bdev1 (1) 00:23:24.464 [2024-05-15 00:05:24.938172] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:24.464 [2024-05-15 00:05:24.938188] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7fd070 name raid_bdev1, state configuring 00:23:24.464 [2024-05-15 00:05:24.938218] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:23:24.464 BaseBdev3 00:23:24.464 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:24.464 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev4 ']' 00:23:24.464 00:05:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev4 00:23:24.722 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:24.980 [2024-05-15 00:05:25.431051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:24.980 [2024-05-15 00:05:25.431097] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.980 [2024-05-15 00:05:25.431116] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9af4c0 00:23:24.980 [2024-05-15 00:05:25.431129] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.980 [2024-05-15 00:05:25.431473] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.980 [2024-05-15 00:05:25.431492] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:24.980 [2024-05-15 00:05:25.431552] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev4 00:23:24.980 [2024-05-15 00:05:25.431571] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:24.980 BaseBdev4 00:23:24.980 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:25.238 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:25.496 [2024-05-15 00:05:25.920422] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:25.496 [2024-05-15 00:05:25.920464] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.496 [2024-05-15 00:05:25.920485] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a0ed0 00:23:25.496 [2024-05-15 00:05:25.920497] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.496 [2024-05-15 00:05:25.920849] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.496 [2024-05-15 00:05:25.920866] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:25.496 [2024-05-15 00:05:25.920940] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:23:25.496 [2024-05-15 00:05:25.920958] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:25.497 spare 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:25.497 00:05:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.497 00:05:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.497 [2024-05-15 00:05:26.021299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x7fc590 00:23:25.497 [2024-05-15 00:05:26.021318] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:25.497 [2024-05-15 00:05:26.021512] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7f7bf0 00:23:25.497 [2024-05-15 00:05:26.021662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7fc590 00:23:25.497 [2024-05-15 00:05:26.021672] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x7fc590 00:23:25.497 [2024-05-15 00:05:26.021789] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:25.755 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:25.755 "name": "raid_bdev1", 00:23:25.755 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:25.755 "strip_size_kb": 0, 00:23:25.755 "state": "online", 00:23:25.755 "raid_level": "raid1", 00:23:25.755 "superblock": true, 00:23:25.755 "num_base_bdevs": 4, 00:23:25.755 "num_base_bdevs_discovered": 3, 00:23:25.755 "num_base_bdevs_operational": 3, 00:23:25.755 "base_bdevs_list": [ 00:23:25.755 { 00:23:25.755 "name": "spare", 00:23:25.755 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:25.755 "is_configured": true, 00:23:25.755 "data_offset": 2048, 00:23:25.755 "data_size": 63488 00:23:25.755 }, 00:23:25.755 { 00:23:25.755 "name": null, 00:23:25.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.755 "is_configured": false, 00:23:25.755 "data_offset": 2048, 00:23:25.755 "data_size": 63488 00:23:25.755 }, 00:23:25.755 { 00:23:25.755 "name": "BaseBdev3", 00:23:25.755 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:25.755 "is_configured": true, 00:23:25.755 "data_offset": 2048, 00:23:25.755 "data_size": 63488 00:23:25.755 }, 00:23:25.755 { 00:23:25.755 "name": "BaseBdev4", 00:23:25.755 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:25.755 "is_configured": true, 00:23:25.755 "data_offset": 2048, 00:23:25.755 "data_size": 63488 00:23:25.755 } 00:23:25.755 ] 00:23:25.755 }' 00:23:25.755 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:25.755 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.321 00:05:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:26.579 "name": "raid_bdev1", 00:23:26.579 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:26.579 "strip_size_kb": 0, 00:23:26.579 "state": "online", 00:23:26.579 "raid_level": "raid1", 00:23:26.579 "superblock": true, 00:23:26.579 "num_base_bdevs": 4, 00:23:26.579 "num_base_bdevs_discovered": 3, 00:23:26.579 "num_base_bdevs_operational": 3, 00:23:26.579 "base_bdevs_list": [ 00:23:26.579 { 00:23:26.579 "name": "spare", 00:23:26.579 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:26.579 "is_configured": true, 00:23:26.579 "data_offset": 2048, 00:23:26.579 "data_size": 63488 00:23:26.579 }, 00:23:26.579 { 00:23:26.579 "name": null, 00:23:26.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.579 "is_configured": false, 00:23:26.579 "data_offset": 2048, 00:23:26.579 "data_size": 63488 00:23:26.579 }, 00:23:26.579 { 00:23:26.579 "name": "BaseBdev3", 00:23:26.579 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:26.579 "is_configured": true, 00:23:26.579 "data_offset": 2048, 00:23:26.579 "data_size": 63488 00:23:26.579 }, 00:23:26.579 { 00:23:26.579 "name": "BaseBdev4", 00:23:26.579 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:26.579 "is_configured": true, 00:23:26.579 "data_offset": 2048, 00:23:26.579 "data_size": 63488 00:23:26.579 } 00:23:26.579 ] 00:23:26.579 }' 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.579 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:26.837 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:23:26.837 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:27.094 [2024-05-15 00:05:27.609259] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:27.094 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 2 00:23:27.094 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:27.094 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.095 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.353 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:27.353 "name": "raid_bdev1", 00:23:27.353 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:27.353 "strip_size_kb": 0, 00:23:27.353 "state": "online", 00:23:27.353 "raid_level": "raid1", 00:23:27.353 "superblock": true, 00:23:27.353 "num_base_bdevs": 4, 00:23:27.353 "num_base_bdevs_discovered": 2, 00:23:27.353 "num_base_bdevs_operational": 2, 00:23:27.353 "base_bdevs_list": [ 00:23:27.353 { 00:23:27.353 "name": null, 00:23:27.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.353 "is_configured": false, 00:23:27.353 "data_offset": 2048, 00:23:27.353 "data_size": 63488 00:23:27.353 }, 00:23:27.353 { 00:23:27.353 "name": null, 00:23:27.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.353 "is_configured": false, 00:23:27.353 "data_offset": 2048, 00:23:27.353 "data_size": 63488 00:23:27.353 }, 00:23:27.353 { 00:23:27.353 "name": "BaseBdev3", 00:23:27.353 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:27.353 "is_configured": true, 00:23:27.353 "data_offset": 2048, 00:23:27.353 "data_size": 63488 00:23:27.353 }, 00:23:27.353 { 00:23:27.353 "name": "BaseBdev4", 00:23:27.353 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:27.353 "is_configured": true, 00:23:27.353 "data_offset": 2048, 00:23:27.353 "data_size": 63488 00:23:27.353 } 00:23:27.353 ] 00:23:27.353 }' 00:23:27.353 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:27.353 00:05:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:27.918 00:05:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:28.176 [2024-05-15 00:05:28.716367] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:28.176 [2024-05-15 00:05:28.716528] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:28.176 [2024-05-15 00:05:28.716545] 
bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:28.176 [2024-05-15 00:05:28.716573] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:28.176 [2024-05-15 00:05:28.720970] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7fe260 00:23:28.176 [2024-05-15 00:05:28.723225] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:28.176 00:05:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # sleep 1 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:29.608 "name": "raid_bdev1", 00:23:29.608 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:29.608 "strip_size_kb": 0, 00:23:29.608 "state": "online", 00:23:29.608 "raid_level": "raid1", 00:23:29.608 "superblock": true, 00:23:29.608 "num_base_bdevs": 4, 00:23:29.608 "num_base_bdevs_discovered": 3, 00:23:29.608 "num_base_bdevs_operational": 3, 00:23:29.608 "process": { 00:23:29.608 "type": "rebuild", 00:23:29.608 "target": "spare", 00:23:29.608 "progress": { 00:23:29.608 "blocks": 24576, 00:23:29.608 "percent": 38 00:23:29.608 } 00:23:29.608 }, 00:23:29.608 "base_bdevs_list": [ 00:23:29.608 { 00:23:29.608 "name": "spare", 00:23:29.608 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:29.608 "is_configured": true, 00:23:29.608 "data_offset": 2048, 00:23:29.608 "data_size": 63488 00:23:29.608 }, 00:23:29.608 { 00:23:29.608 "name": null, 00:23:29.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.608 "is_configured": false, 00:23:29.608 "data_offset": 2048, 00:23:29.608 "data_size": 63488 00:23:29.608 }, 00:23:29.608 { 00:23:29.608 "name": "BaseBdev3", 00:23:29.608 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:29.608 "is_configured": true, 00:23:29.608 "data_offset": 2048, 00:23:29.608 "data_size": 63488 00:23:29.608 }, 00:23:29.608 { 00:23:29.608 "name": "BaseBdev4", 00:23:29.608 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:29.608 "is_configured": true, 00:23:29.608 "data_offset": 2048, 00:23:29.608 "data_size": 63488 00:23:29.608 } 00:23:29.608 ] 00:23:29.608 }' 00:23:29.608 00:05:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:29.608 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:29.608 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:29.608 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ 
spare == \s\p\a\r\e ]] 00:23:29.608 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:29.867 [2024-05-15 00:05:30.247328] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:29.867 [2024-05-15 00:05:30.335991] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:29.867 [2024-05-15 00:05:30.336037] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.867 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.125 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:30.125 "name": "raid_bdev1", 00:23:30.125 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:30.125 "strip_size_kb": 0, 00:23:30.125 "state": "online", 00:23:30.125 "raid_level": "raid1", 00:23:30.125 "superblock": true, 00:23:30.125 "num_base_bdevs": 4, 00:23:30.125 "num_base_bdevs_discovered": 2, 00:23:30.125 "num_base_bdevs_operational": 2, 00:23:30.125 "base_bdevs_list": [ 00:23:30.125 { 00:23:30.125 "name": null, 00:23:30.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.125 "is_configured": false, 00:23:30.125 "data_offset": 2048, 00:23:30.125 "data_size": 63488 00:23:30.125 }, 00:23:30.125 { 00:23:30.125 "name": null, 00:23:30.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.125 "is_configured": false, 00:23:30.125 "data_offset": 2048, 00:23:30.125 "data_size": 63488 00:23:30.125 }, 00:23:30.125 { 00:23:30.125 "name": "BaseBdev3", 00:23:30.125 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:30.125 "is_configured": true, 00:23:30.125 "data_offset": 2048, 00:23:30.125 "data_size": 63488 00:23:30.125 }, 00:23:30.125 { 00:23:30.125 "name": "BaseBdev4", 00:23:30.125 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:30.125 "is_configured": true, 00:23:30.125 "data_offset": 2048, 00:23:30.125 "data_size": 63488 00:23:30.125 } 00:23:30.125 ] 00:23:30.125 }' 00:23:30.125 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:30.125 00:05:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:30.691 00:05:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:30.949 [2024-05-15 00:05:31.451505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:30.949 [2024-05-15 00:05:31.451558] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.949 [2024-05-15 00:05:31.451579] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7fa190 00:23:30.949 [2024-05-15 00:05:31.451591] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.949 [2024-05-15 00:05:31.451964] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.949 [2024-05-15 00:05:31.451982] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:30.949 [2024-05-15 00:05:31.452059] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:23:30.949 [2024-05-15 00:05:31.452078] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:30.949 [2024-05-15 00:05:31.452088] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:30.949 [2024-05-15 00:05:31.452106] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:30.949 [2024-05-15 00:05:31.456546] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x4d9860 00:23:30.949 spare 00:23:30.949 [2024-05-15 00:05:31.458023] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:30.949 00:05:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # sleep 1 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.323 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:32.323 "name": "raid_bdev1", 00:23:32.323 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:32.323 "strip_size_kb": 0, 00:23:32.323 "state": "online", 00:23:32.323 "raid_level": "raid1", 00:23:32.323 "superblock": true, 00:23:32.323 "num_base_bdevs": 4, 00:23:32.323 "num_base_bdevs_discovered": 3, 00:23:32.323 "num_base_bdevs_operational": 3, 00:23:32.324 "process": { 00:23:32.324 "type": "rebuild", 00:23:32.324 "target": "spare", 00:23:32.324 "progress": { 00:23:32.324 "blocks": 24576, 
00:23:32.324 "percent": 38 00:23:32.324 } 00:23:32.324 }, 00:23:32.324 "base_bdevs_list": [ 00:23:32.324 { 00:23:32.324 "name": "spare", 00:23:32.324 "uuid": "187aed10-98aa-5406-8eff-fe98d5a3fb28", 00:23:32.324 "is_configured": true, 00:23:32.324 "data_offset": 2048, 00:23:32.324 "data_size": 63488 00:23:32.324 }, 00:23:32.324 { 00:23:32.324 "name": null, 00:23:32.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.324 "is_configured": false, 00:23:32.324 "data_offset": 2048, 00:23:32.324 "data_size": 63488 00:23:32.324 }, 00:23:32.324 { 00:23:32.324 "name": "BaseBdev3", 00:23:32.324 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:32.324 "is_configured": true, 00:23:32.324 "data_offset": 2048, 00:23:32.324 "data_size": 63488 00:23:32.324 }, 00:23:32.324 { 00:23:32.324 "name": "BaseBdev4", 00:23:32.324 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:32.324 "is_configured": true, 00:23:32.324 "data_offset": 2048, 00:23:32.324 "data_size": 63488 00:23:32.324 } 00:23:32.324 ] 00:23:32.324 }' 00:23:32.324 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:32.324 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:32.324 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:32.324 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:32.324 00:05:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:32.582 [2024-05-15 00:05:33.046237] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.582 [2024-05-15 00:05:33.070694] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:32.582 [2024-05-15 00:05:33.070738] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.582 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.839 00:05:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:32.840 "name": "raid_bdev1", 00:23:32.840 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:32.840 "strip_size_kb": 0, 00:23:32.840 "state": "online", 00:23:32.840 "raid_level": "raid1", 00:23:32.840 "superblock": true, 00:23:32.840 "num_base_bdevs": 4, 00:23:32.840 "num_base_bdevs_discovered": 2, 00:23:32.840 "num_base_bdevs_operational": 2, 00:23:32.840 "base_bdevs_list": [ 00:23:32.840 { 00:23:32.840 "name": null, 00:23:32.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.840 "is_configured": false, 00:23:32.840 "data_offset": 2048, 00:23:32.840 "data_size": 63488 00:23:32.840 }, 00:23:32.840 { 00:23:32.840 "name": null, 00:23:32.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.840 "is_configured": false, 00:23:32.840 "data_offset": 2048, 00:23:32.840 "data_size": 63488 00:23:32.840 }, 00:23:32.840 { 00:23:32.840 "name": "BaseBdev3", 00:23:32.840 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:32.840 "is_configured": true, 00:23:32.840 "data_offset": 2048, 00:23:32.840 "data_size": 63488 00:23:32.840 }, 00:23:32.840 { 00:23:32.840 "name": "BaseBdev4", 00:23:32.840 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:32.840 "is_configured": true, 00:23:32.840 "data_offset": 2048, 00:23:32.840 "data_size": 63488 00:23:32.840 } 00:23:32.840 ] 00:23:32.840 }' 00:23:32.840 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:32.840 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.404 00:05:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.662 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:33.662 "name": "raid_bdev1", 00:23:33.662 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:33.662 "strip_size_kb": 0, 00:23:33.662 "state": "online", 00:23:33.662 "raid_level": "raid1", 00:23:33.662 "superblock": true, 00:23:33.662 "num_base_bdevs": 4, 00:23:33.662 "num_base_bdevs_discovered": 2, 00:23:33.662 "num_base_bdevs_operational": 2, 00:23:33.662 "base_bdevs_list": [ 00:23:33.662 { 00:23:33.662 "name": null, 00:23:33.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.662 "is_configured": false, 00:23:33.662 "data_offset": 2048, 00:23:33.662 "data_size": 63488 00:23:33.662 }, 00:23:33.662 { 00:23:33.662 "name": null, 00:23:33.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.662 "is_configured": false, 00:23:33.662 "data_offset": 2048, 00:23:33.662 "data_size": 63488 00:23:33.662 }, 00:23:33.662 { 00:23:33.662 "name": "BaseBdev3", 00:23:33.662 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 
00:23:33.662 "is_configured": true, 00:23:33.662 "data_offset": 2048, 00:23:33.662 "data_size": 63488 00:23:33.662 }, 00:23:33.662 { 00:23:33.662 "name": "BaseBdev4", 00:23:33.662 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:33.662 "is_configured": true, 00:23:33.662 "data_offset": 2048, 00:23:33.662 "data_size": 63488 00:23:33.662 } 00:23:33.662 ] 00:23:33.662 }' 00:23:33.662 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:33.662 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.662 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:33.920 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:33.920 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:33.920 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:34.178 [2024-05-15 00:05:34.675434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:34.178 [2024-05-15 00:05:34.675488] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.178 [2024-05-15 00:05:34.675510] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7fdd80 00:23:34.178 [2024-05-15 00:05:34.675523] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.178 [2024-05-15 00:05:34.675863] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.178 [2024-05-15 00:05:34.675880] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:34.178 [2024-05-15 00:05:34.675945] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:34.178 [2024-05-15 00:05:34.675957] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:34.178 [2024-05-15 00:05:34.675967] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:34.178 BaseBdev1 00:23:34.178 00:05:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@786 -- # sleep 1 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:35.111 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:35.369 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:35.369 00:05:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:35.369 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:35.369 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.369 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.369 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:35.369 "name": "raid_bdev1", 00:23:35.369 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:35.369 "strip_size_kb": 0, 00:23:35.369 "state": "online", 00:23:35.369 "raid_level": "raid1", 00:23:35.369 "superblock": true, 00:23:35.369 "num_base_bdevs": 4, 00:23:35.369 "num_base_bdevs_discovered": 2, 00:23:35.369 "num_base_bdevs_operational": 2, 00:23:35.369 "base_bdevs_list": [ 00:23:35.369 { 00:23:35.369 "name": null, 00:23:35.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.369 "is_configured": false, 00:23:35.369 "data_offset": 2048, 00:23:35.369 "data_size": 63488 00:23:35.369 }, 00:23:35.369 { 00:23:35.369 "name": null, 00:23:35.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.370 "is_configured": false, 00:23:35.370 "data_offset": 2048, 00:23:35.370 "data_size": 63488 00:23:35.370 }, 00:23:35.370 { 00:23:35.370 "name": "BaseBdev3", 00:23:35.370 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:35.370 "is_configured": true, 00:23:35.370 "data_offset": 2048, 00:23:35.370 "data_size": 63488 00:23:35.370 }, 00:23:35.370 { 00:23:35.370 "name": "BaseBdev4", 00:23:35.370 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:35.370 "is_configured": true, 00:23:35.370 "data_offset": 2048, 00:23:35.370 "data_size": 63488 00:23:35.370 } 00:23:35.370 ] 00:23:35.370 }' 00:23:35.370 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:35.370 00:05:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:35.935 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.935 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:35.935 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:36.192 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:36.192 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:36.192 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.193 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.193 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:36.193 "name": "raid_bdev1", 00:23:36.193 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:36.193 "strip_size_kb": 0, 00:23:36.193 "state": "online", 00:23:36.193 "raid_level": "raid1", 00:23:36.193 "superblock": true, 00:23:36.193 "num_base_bdevs": 4, 00:23:36.193 "num_base_bdevs_discovered": 2, 00:23:36.193 "num_base_bdevs_operational": 2, 00:23:36.193 "base_bdevs_list": [ 00:23:36.193 { 
00:23:36.193 "name": null, 00:23:36.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.193 "is_configured": false, 00:23:36.193 "data_offset": 2048, 00:23:36.193 "data_size": 63488 00:23:36.193 }, 00:23:36.193 { 00:23:36.193 "name": null, 00:23:36.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.193 "is_configured": false, 00:23:36.193 "data_offset": 2048, 00:23:36.193 "data_size": 63488 00:23:36.193 }, 00:23:36.193 { 00:23:36.193 "name": "BaseBdev3", 00:23:36.193 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:36.193 "is_configured": true, 00:23:36.193 "data_offset": 2048, 00:23:36.193 "data_size": 63488 00:23:36.193 }, 00:23:36.193 { 00:23:36.193 "name": "BaseBdev4", 00:23:36.193 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:36.193 "is_configured": true, 00:23:36.193 "data_offset": 2048, 00:23:36.193 "data_size": 63488 00:23:36.193 } 00:23:36.193 ] 00:23:36.193 }' 00:23:36.193 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:36.450 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:36.450 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:36.450 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:36.450 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.450 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:36.451 00:05:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.708 [2024-05-15 00:05:37.082112] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:36.708 [2024-05-15 00:05:37.082238] 
bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:36.708 [2024-05-15 00:05:37.082254] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:36.708 request: 00:23:36.708 { 00:23:36.708 "raid_bdev": "raid_bdev1", 00:23:36.708 "base_bdev": "BaseBdev1", 00:23:36.708 "method": "bdev_raid_add_base_bdev", 00:23:36.708 "req_id": 1 00:23:36.708 } 00:23:36.708 Got JSON-RPC error response 00:23:36.708 response: 00:23:36.708 { 00:23:36.708 "code": -22, 00:23:36.708 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:36.708 } 00:23:36.708 00:05:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:23:36.708 00:05:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:36.708 00:05:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:36.708 00:05:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:36.708 00:05:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # sleep 1 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.642 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.901 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:37.901 "name": "raid_bdev1", 00:23:37.901 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:37.901 "strip_size_kb": 0, 00:23:37.901 "state": "online", 00:23:37.901 "raid_level": "raid1", 00:23:37.901 "superblock": true, 00:23:37.901 "num_base_bdevs": 4, 00:23:37.901 "num_base_bdevs_discovered": 2, 00:23:37.901 "num_base_bdevs_operational": 2, 00:23:37.901 "base_bdevs_list": [ 00:23:37.901 { 00:23:37.901 "name": null, 00:23:37.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.901 "is_configured": false, 00:23:37.901 "data_offset": 2048, 00:23:37.901 "data_size": 63488 00:23:37.901 }, 00:23:37.901 { 00:23:37.901 "name": null, 00:23:37.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.901 "is_configured": false, 00:23:37.901 "data_offset": 
2048, 00:23:37.901 "data_size": 63488 00:23:37.901 }, 00:23:37.901 { 00:23:37.901 "name": "BaseBdev3", 00:23:37.901 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:37.901 "is_configured": true, 00:23:37.901 "data_offset": 2048, 00:23:37.901 "data_size": 63488 00:23:37.901 }, 00:23:37.901 { 00:23:37.901 "name": "BaseBdev4", 00:23:37.901 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:37.901 "is_configured": true, 00:23:37.901 "data_offset": 2048, 00:23:37.901 "data_size": 63488 00:23:37.901 } 00:23:37.901 ] 00:23:37.901 }' 00:23:37.901 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:37.901 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.466 00:05:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:38.724 "name": "raid_bdev1", 00:23:38.724 "uuid": "9a796da2-5f6b-427f-bf3f-d4c3855cb499", 00:23:38.724 "strip_size_kb": 0, 00:23:38.724 "state": "online", 00:23:38.724 "raid_level": "raid1", 00:23:38.724 "superblock": true, 00:23:38.724 "num_base_bdevs": 4, 00:23:38.724 "num_base_bdevs_discovered": 2, 00:23:38.724 "num_base_bdevs_operational": 2, 00:23:38.724 "base_bdevs_list": [ 00:23:38.724 { 00:23:38.724 "name": null, 00:23:38.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.724 "is_configured": false, 00:23:38.724 "data_offset": 2048, 00:23:38.724 "data_size": 63488 00:23:38.724 }, 00:23:38.724 { 00:23:38.724 "name": null, 00:23:38.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.724 "is_configured": false, 00:23:38.724 "data_offset": 2048, 00:23:38.724 "data_size": 63488 00:23:38.724 }, 00:23:38.724 { 00:23:38.724 "name": "BaseBdev3", 00:23:38.724 "uuid": "c0ff0546-61f9-510d-b009-54ac06993711", 00:23:38.724 "is_configured": true, 00:23:38.724 "data_offset": 2048, 00:23:38.724 "data_size": 63488 00:23:38.724 }, 00:23:38.724 { 00:23:38.724 "name": "BaseBdev4", 00:23:38.724 "uuid": "ef062340-f95f-5893-a2c7-a598208517fa", 00:23:38.724 "is_configured": true, 00:23:38.724 "data_offset": 2048, 00:23:38.724 "data_size": 63488 00:23:38.724 } 00:23:38.724 ] 00:23:38.724 }' 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:38.724 00:05:39 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # killprocess 499869 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 499869 ']' 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 499869 00:23:38.724 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 499869 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 499869' 00:23:38.982 killing process with pid 499869 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 499869 00:23:38.982 Received shutdown signal, test time was about 28.727718 seconds 00:23:38.982 00:23:38.982 Latency(us) 00:23:38.982 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:38.982 =================================================================================================================== 00:23:38.982 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:38.982 [2024-05-15 00:05:39.357804] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:38.982 [2024-05-15 00:05:39.357906] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:38.982 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 499869 00:23:38.982 [2024-05-15 00:05:39.357976] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:38.982 [2024-05-15 00:05:39.357989] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7fc590 name raid_bdev1, state offline 00:23:38.982 [2024-05-15 00:05:39.399089] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:39.241 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@797 -- # return 0 00:23:39.241 00:23:39.241 real 0m34.419s 00:23:39.241 user 0m54.549s 00:23:39.241 sys 0m5.389s 00:23:39.241 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:39.241 00:05:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:39.241 ************************************ 00:23:39.241 END TEST raid_rebuild_test_sb_io 00:23:39.241 ************************************ 00:23:39.241 00:05:39 bdev_raid -- bdev/bdev_raid.sh@830 -- # '[' n == y ']' 00:23:39.241 00:05:39 bdev_raid -- bdev/bdev_raid.sh@842 -- # base_blocklen=4096 00:23:39.241 00:05:39 bdev_raid -- bdev/bdev_raid.sh@844 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:23:39.241 00:05:39 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:23:39.241 00:05:39 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:39.241 00:05:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:39.241 ************************************ 00:23:39.241 START TEST raid_state_function_test_sb_4k 00:23:39.241 ************************************ 00:23:39.241 00:05:39 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # raid_pid=504737 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 504737' 00:23:39.241 Process raid pid: 504737 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@247 -- # waitforlisten 504737 /var/tmp/spdk-raid.sock 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 504737 ']' 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:39.241 00:05:39 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:39.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:39.241 00:05:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:39.241 [2024-05-15 00:05:39.792865] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:23:39.241 [2024-05-15 00:05:39.792925] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:39.499 [2024-05-15 00:05:39.914203] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:39.499 [2024-05-15 00:05:40.023848] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:39.757 [2024-05-15 00:05:40.093432] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:39.757 [2024-05-15 00:05:40.093459] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:40.323 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:40.323 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:23:40.323 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:40.581 [2024-05-15 00:05:40.950035] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:40.581 [2024-05-15 00:05:40.950077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:40.581 [2024-05-15 00:05:40.950088] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:40.581 [2024-05-15 00:05:40.950099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local 
num_base_bdevs_discovered 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.581 00:05:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:40.839 00:05:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:40.839 "name": "Existed_Raid", 00:23:40.839 "uuid": "932ee828-0f8e-4e09-8ac4-9775422076dc", 00:23:40.839 "strip_size_kb": 0, 00:23:40.839 "state": "configuring", 00:23:40.839 "raid_level": "raid1", 00:23:40.839 "superblock": true, 00:23:40.839 "num_base_bdevs": 2, 00:23:40.839 "num_base_bdevs_discovered": 0, 00:23:40.839 "num_base_bdevs_operational": 2, 00:23:40.839 "base_bdevs_list": [ 00:23:40.839 { 00:23:40.839 "name": "BaseBdev1", 00:23:40.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.839 "is_configured": false, 00:23:40.839 "data_offset": 0, 00:23:40.839 "data_size": 0 00:23:40.839 }, 00:23:40.839 { 00:23:40.839 "name": "BaseBdev2", 00:23:40.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.839 "is_configured": false, 00:23:40.839 "data_offset": 0, 00:23:40.839 "data_size": 0 00:23:40.839 } 00:23:40.839 ] 00:23:40.839 }' 00:23:40.839 00:05:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:40.839 00:05:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:41.403 00:05:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:41.660 [2024-05-15 00:05:42.024724] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:41.660 [2024-05-15 00:05:42.024756] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a5bc0 name Existed_Raid, state configuring 00:23:41.660 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:41.918 [2024-05-15 00:05:42.269392] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:41.918 [2024-05-15 00:05:42.269430] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:41.918 [2024-05-15 00:05:42.269440] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:41.918 [2024-05-15 00:05:42.269452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:41.918 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:23:42.175 [2024-05-15 00:05:42.528022] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:42.175 BaseBdev1 00:23:42.175 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:23:42.175 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 
00:23:42.176 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:42.176 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:23:42.176 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:42.176 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:42.176 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:42.433 00:05:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:42.433 [ 00:23:42.433 { 00:23:42.433 "name": "BaseBdev1", 00:23:42.433 "aliases": [ 00:23:42.433 "ab84aac6-c076-4895-bb0d-eca1ca31f36f" 00:23:42.433 ], 00:23:42.433 "product_name": "Malloc disk", 00:23:42.433 "block_size": 4096, 00:23:42.433 "num_blocks": 8192, 00:23:42.433 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:42.433 "assigned_rate_limits": { 00:23:42.433 "rw_ios_per_sec": 0, 00:23:42.433 "rw_mbytes_per_sec": 0, 00:23:42.433 "r_mbytes_per_sec": 0, 00:23:42.433 "w_mbytes_per_sec": 0 00:23:42.433 }, 00:23:42.433 "claimed": true, 00:23:42.433 "claim_type": "exclusive_write", 00:23:42.433 "zoned": false, 00:23:42.433 "supported_io_types": { 00:23:42.433 "read": true, 00:23:42.433 "write": true, 00:23:42.433 "unmap": true, 00:23:42.433 "write_zeroes": true, 00:23:42.433 "flush": true, 00:23:42.433 "reset": true, 00:23:42.433 "compare": false, 00:23:42.433 "compare_and_write": false, 00:23:42.433 "abort": true, 00:23:42.433 "nvme_admin": false, 00:23:42.433 "nvme_io": false 00:23:42.433 }, 00:23:42.433 "memory_domains": [ 00:23:42.433 { 00:23:42.433 "dma_device_id": "system", 00:23:42.433 "dma_device_type": 1 00:23:42.433 }, 00:23:42.433 { 00:23:42.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.433 "dma_device_type": 2 00:23:42.433 } 00:23:42.433 ], 00:23:42.433 "driver_specific": {} 00:23:42.433 } 00:23:42.433 ] 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:42.691 "name": "Existed_Raid", 00:23:42.691 "uuid": "572da42c-f8dd-4a62-a54e-164724910e90", 00:23:42.691 "strip_size_kb": 0, 00:23:42.691 "state": "configuring", 00:23:42.691 "raid_level": "raid1", 00:23:42.691 "superblock": true, 00:23:42.691 "num_base_bdevs": 2, 00:23:42.691 "num_base_bdevs_discovered": 1, 00:23:42.691 "num_base_bdevs_operational": 2, 00:23:42.691 "base_bdevs_list": [ 00:23:42.691 { 00:23:42.691 "name": "BaseBdev1", 00:23:42.691 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:42.691 "is_configured": true, 00:23:42.691 "data_offset": 256, 00:23:42.691 "data_size": 7936 00:23:42.691 }, 00:23:42.691 { 00:23:42.691 "name": "BaseBdev2", 00:23:42.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:42.691 "is_configured": false, 00:23:42.691 "data_offset": 0, 00:23:42.691 "data_size": 0 00:23:42.691 } 00:23:42.691 ] 00:23:42.691 }' 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:42.691 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:43.258 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:43.540 [2024-05-15 00:05:43.939875] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:43.540 [2024-05-15 00:05:43.939914] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a5e60 name Existed_Raid, state configuring 00:23:43.540 00:05:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:43.814 [2024-05-15 00:05:44.184551] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:43.814 [2024-05-15 00:05:44.186042] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:43.814 [2024-05-15 00:05:44.186074] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:43.814 "name": "Existed_Raid", 00:23:43.814 "uuid": "0a5ce7f5-2daa-4735-81b4-99bd079564ad", 00:23:43.814 "strip_size_kb": 0, 00:23:43.814 "state": "configuring", 00:23:43.814 "raid_level": "raid1", 00:23:43.814 "superblock": true, 00:23:43.814 "num_base_bdevs": 2, 00:23:43.814 "num_base_bdevs_discovered": 1, 00:23:43.814 "num_base_bdevs_operational": 2, 00:23:43.814 "base_bdevs_list": [ 00:23:43.814 { 00:23:43.814 "name": "BaseBdev1", 00:23:43.814 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:43.814 "is_configured": true, 00:23:43.814 "data_offset": 256, 00:23:43.814 "data_size": 7936 00:23:43.814 }, 00:23:43.814 { 00:23:43.814 "name": "BaseBdev2", 00:23:43.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.814 "is_configured": false, 00:23:43.814 "data_offset": 0, 00:23:43.814 "data_size": 0 00:23:43.814 } 00:23:43.814 ] 00:23:43.814 }' 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:43.814 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:44.747 00:05:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:23:44.747 [2024-05-15 00:05:45.218615] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:44.747 [2024-05-15 00:05:45.218760] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x14a54b0 00:23:44.747 [2024-05-15 00:05:45.218774] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:44.747 [2024-05-15 00:05:45.218947] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a5a70 00:23:44.747 [2024-05-15 00:05:45.219067] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14a54b0 00:23:44.747 [2024-05-15 00:05:45.219077] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14a54b0 00:23:44.747 [2024-05-15 00:05:45.219169] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:44.747 BaseBdev2 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:23:44.747 00:05:45 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:44.747 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:45.004 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:45.261 [ 00:23:45.261 { 00:23:45.261 "name": "BaseBdev2", 00:23:45.261 "aliases": [ 00:23:45.261 "4ae93647-6e0d-41ab-851f-5ec40eae73b9" 00:23:45.261 ], 00:23:45.261 "product_name": "Malloc disk", 00:23:45.261 "block_size": 4096, 00:23:45.261 "num_blocks": 8192, 00:23:45.261 "uuid": "4ae93647-6e0d-41ab-851f-5ec40eae73b9", 00:23:45.261 "assigned_rate_limits": { 00:23:45.261 "rw_ios_per_sec": 0, 00:23:45.261 "rw_mbytes_per_sec": 0, 00:23:45.261 "r_mbytes_per_sec": 0, 00:23:45.261 "w_mbytes_per_sec": 0 00:23:45.261 }, 00:23:45.261 "claimed": true, 00:23:45.261 "claim_type": "exclusive_write", 00:23:45.261 "zoned": false, 00:23:45.261 "supported_io_types": { 00:23:45.261 "read": true, 00:23:45.261 "write": true, 00:23:45.261 "unmap": true, 00:23:45.261 "write_zeroes": true, 00:23:45.261 "flush": true, 00:23:45.261 "reset": true, 00:23:45.261 "compare": false, 00:23:45.261 "compare_and_write": false, 00:23:45.261 "abort": true, 00:23:45.261 "nvme_admin": false, 00:23:45.261 "nvme_io": false 00:23:45.261 }, 00:23:45.261 "memory_domains": [ 00:23:45.261 { 00:23:45.261 "dma_device_id": "system", 00:23:45.261 "dma_device_type": 1 00:23:45.261 }, 00:23:45.261 { 00:23:45.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.261 "dma_device_type": 2 00:23:45.261 } 00:23:45.261 ], 00:23:45.261 "driver_specific": {} 00:23:45.261 } 00:23:45.261 ] 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 
00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.261 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:45.733 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:45.733 "name": "Existed_Raid", 00:23:45.733 "uuid": "0a5ce7f5-2daa-4735-81b4-99bd079564ad", 00:23:45.733 "strip_size_kb": 0, 00:23:45.733 "state": "online", 00:23:45.733 "raid_level": "raid1", 00:23:45.733 "superblock": true, 00:23:45.733 "num_base_bdevs": 2, 00:23:45.733 "num_base_bdevs_discovered": 2, 00:23:45.733 "num_base_bdevs_operational": 2, 00:23:45.733 "base_bdevs_list": [ 00:23:45.733 { 00:23:45.733 "name": "BaseBdev1", 00:23:45.733 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:45.733 "is_configured": true, 00:23:45.733 "data_offset": 256, 00:23:45.733 "data_size": 7936 00:23:45.733 }, 00:23:45.733 { 00:23:45.733 "name": "BaseBdev2", 00:23:45.733 "uuid": "4ae93647-6e0d-41ab-851f-5ec40eae73b9", 00:23:45.733 "is_configured": true, 00:23:45.733 "data_offset": 256, 00:23:45.733 "data_size": 7936 00:23:45.733 } 00:23:45.733 ] 00:23:45.733 }' 00:23:45.733 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:45.733 00:05:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:45.990 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:46.248 [2024-05-15 00:05:46.666717] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:46.248 "name": "Existed_Raid", 00:23:46.248 "aliases": [ 00:23:46.248 "0a5ce7f5-2daa-4735-81b4-99bd079564ad" 00:23:46.248 ], 00:23:46.248 "product_name": "Raid Volume", 00:23:46.248 "block_size": 4096, 00:23:46.248 "num_blocks": 7936, 00:23:46.248 "uuid": "0a5ce7f5-2daa-4735-81b4-99bd079564ad", 00:23:46.248 "assigned_rate_limits": { 00:23:46.248 "rw_ios_per_sec": 0, 00:23:46.248 "rw_mbytes_per_sec": 0, 00:23:46.248 "r_mbytes_per_sec": 0, 00:23:46.248 "w_mbytes_per_sec": 0 00:23:46.248 }, 00:23:46.248 "claimed": 
false, 00:23:46.248 "zoned": false, 00:23:46.248 "supported_io_types": { 00:23:46.248 "read": true, 00:23:46.248 "write": true, 00:23:46.248 "unmap": false, 00:23:46.248 "write_zeroes": true, 00:23:46.248 "flush": false, 00:23:46.248 "reset": true, 00:23:46.248 "compare": false, 00:23:46.248 "compare_and_write": false, 00:23:46.248 "abort": false, 00:23:46.248 "nvme_admin": false, 00:23:46.248 "nvme_io": false 00:23:46.248 }, 00:23:46.248 "memory_domains": [ 00:23:46.248 { 00:23:46.248 "dma_device_id": "system", 00:23:46.248 "dma_device_type": 1 00:23:46.248 }, 00:23:46.248 { 00:23:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:46.248 "dma_device_type": 2 00:23:46.248 }, 00:23:46.248 { 00:23:46.248 "dma_device_id": "system", 00:23:46.248 "dma_device_type": 1 00:23:46.248 }, 00:23:46.248 { 00:23:46.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:46.248 "dma_device_type": 2 00:23:46.248 } 00:23:46.248 ], 00:23:46.248 "driver_specific": { 00:23:46.248 "raid": { 00:23:46.248 "uuid": "0a5ce7f5-2daa-4735-81b4-99bd079564ad", 00:23:46.248 "strip_size_kb": 0, 00:23:46.248 "state": "online", 00:23:46.248 "raid_level": "raid1", 00:23:46.248 "superblock": true, 00:23:46.248 "num_base_bdevs": 2, 00:23:46.248 "num_base_bdevs_discovered": 2, 00:23:46.248 "num_base_bdevs_operational": 2, 00:23:46.248 "base_bdevs_list": [ 00:23:46.248 { 00:23:46.248 "name": "BaseBdev1", 00:23:46.248 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:46.248 "is_configured": true, 00:23:46.248 "data_offset": 256, 00:23:46.248 "data_size": 7936 00:23:46.248 }, 00:23:46.248 { 00:23:46.248 "name": "BaseBdev2", 00:23:46.248 "uuid": "4ae93647-6e0d-41ab-851f-5ec40eae73b9", 00:23:46.248 "is_configured": true, 00:23:46.248 "data_offset": 256, 00:23:46.248 "data_size": 7936 00:23:46.248 } 00:23:46.248 ] 00:23:46.248 } 00:23:46.248 } 00:23:46.248 }' 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:23:46.248 BaseBdev2' 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:46.248 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:46.506 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:46.506 "name": "BaseBdev1", 00:23:46.506 "aliases": [ 00:23:46.506 "ab84aac6-c076-4895-bb0d-eca1ca31f36f" 00:23:46.506 ], 00:23:46.506 "product_name": "Malloc disk", 00:23:46.506 "block_size": 4096, 00:23:46.506 "num_blocks": 8192, 00:23:46.506 "uuid": "ab84aac6-c076-4895-bb0d-eca1ca31f36f", 00:23:46.506 "assigned_rate_limits": { 00:23:46.506 "rw_ios_per_sec": 0, 00:23:46.506 "rw_mbytes_per_sec": 0, 00:23:46.506 "r_mbytes_per_sec": 0, 00:23:46.506 "w_mbytes_per_sec": 0 00:23:46.506 }, 00:23:46.506 "claimed": true, 00:23:46.506 "claim_type": "exclusive_write", 00:23:46.506 "zoned": false, 00:23:46.506 "supported_io_types": { 00:23:46.506 "read": true, 00:23:46.506 "write": true, 00:23:46.506 "unmap": true, 00:23:46.506 "write_zeroes": true, 00:23:46.506 "flush": true, 00:23:46.506 "reset": true, 00:23:46.506 
"compare": false, 00:23:46.506 "compare_and_write": false, 00:23:46.506 "abort": true, 00:23:46.506 "nvme_admin": false, 00:23:46.506 "nvme_io": false 00:23:46.506 }, 00:23:46.506 "memory_domains": [ 00:23:46.506 { 00:23:46.506 "dma_device_id": "system", 00:23:46.506 "dma_device_type": 1 00:23:46.506 }, 00:23:46.506 { 00:23:46.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:46.506 "dma_device_type": 2 00:23:46.506 } 00:23:46.506 ], 00:23:46.506 "driver_specific": {} 00:23:46.506 }' 00:23:46.506 00:05:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:46.506 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:46.506 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:46.506 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:46.506 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:46.763 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:47.021 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:47.021 "name": "BaseBdev2", 00:23:47.021 "aliases": [ 00:23:47.021 "4ae93647-6e0d-41ab-851f-5ec40eae73b9" 00:23:47.021 ], 00:23:47.021 "product_name": "Malloc disk", 00:23:47.021 "block_size": 4096, 00:23:47.021 "num_blocks": 8192, 00:23:47.021 "uuid": "4ae93647-6e0d-41ab-851f-5ec40eae73b9", 00:23:47.021 "assigned_rate_limits": { 00:23:47.021 "rw_ios_per_sec": 0, 00:23:47.021 "rw_mbytes_per_sec": 0, 00:23:47.021 "r_mbytes_per_sec": 0, 00:23:47.021 "w_mbytes_per_sec": 0 00:23:47.021 }, 00:23:47.021 "claimed": true, 00:23:47.021 "claim_type": "exclusive_write", 00:23:47.021 "zoned": false, 00:23:47.021 "supported_io_types": { 00:23:47.021 "read": true, 00:23:47.021 "write": true, 00:23:47.021 "unmap": true, 00:23:47.021 "write_zeroes": true, 00:23:47.021 "flush": true, 00:23:47.021 "reset": true, 00:23:47.021 "compare": false, 00:23:47.021 "compare_and_write": false, 00:23:47.021 "abort": true, 00:23:47.021 "nvme_admin": false, 00:23:47.021 "nvme_io": false 00:23:47.021 }, 00:23:47.021 "memory_domains": [ 00:23:47.021 { 00:23:47.021 "dma_device_id": "system", 00:23:47.021 "dma_device_type": 1 00:23:47.021 }, 00:23:47.021 { 00:23:47.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:23:47.021 "dma_device_type": 2 00:23:47.021 } 00:23:47.021 ], 00:23:47.021 "driver_specific": {} 00:23:47.021 }' 00:23:47.021 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:47.021 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:47.021 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:47.279 00:05:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:47.537 [2024-05-15 00:05:48.090317] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # local expected_state 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:47.537 00:05:48 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.537 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:47.795 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:47.795 "name": "Existed_Raid", 00:23:47.795 "uuid": "0a5ce7f5-2daa-4735-81b4-99bd079564ad", 00:23:47.795 "strip_size_kb": 0, 00:23:47.795 "state": "online", 00:23:47.795 "raid_level": "raid1", 00:23:47.795 "superblock": true, 00:23:47.795 "num_base_bdevs": 2, 00:23:47.795 "num_base_bdevs_discovered": 1, 00:23:47.795 "num_base_bdevs_operational": 1, 00:23:47.795 "base_bdevs_list": [ 00:23:47.795 { 00:23:47.795 "name": null, 00:23:47.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.795 "is_configured": false, 00:23:47.795 "data_offset": 256, 00:23:47.795 "data_size": 7936 00:23:47.795 }, 00:23:47.795 { 00:23:47.795 "name": "BaseBdev2", 00:23:47.795 "uuid": "4ae93647-6e0d-41ab-851f-5ec40eae73b9", 00:23:47.795 "is_configured": true, 00:23:47.795 "data_offset": 256, 00:23:47.795 "data_size": 7936 00:23:47.795 } 00:23:47.795 ] 00:23:47.795 }' 00:23:47.795 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:47.795 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:48.359 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:23:48.359 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:48.359 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.359 00:05:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:23:48.617 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:23:48.617 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:48.617 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:48.875 [2024-05-15 00:05:49.298685] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:48.875 [2024-05-15 00:05:49.298765] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:48.875 [2024-05-15 00:05:49.311464] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:48.875 [2024-05-15 00:05:49.311531] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:48.875 [2024-05-15 00:05:49.311544] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a54b0 name Existed_Raid, state offline 00:23:48.875 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:23:48.875 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:48.875 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.875 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@342 -- # killprocess 504737 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 504737 ']' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 504737 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 504737 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 504737' 00:23:49.133 killing process with pid 504737 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@965 -- # kill 504737 00:23:49.133 [2024-05-15 00:05:49.617796] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:49.133 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@970 -- # wait 504737 00:23:49.134 [2024-05-15 00:05:49.618778] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:49.392 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@344 -- # return 0 00:23:49.392 00:23:49.392 real 0m10.138s 00:23:49.392 user 0m17.911s 00:23:49.392 sys 0m1.976s 00:23:49.392 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:49.392 00:05:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:49.392 ************************************ 00:23:49.392 END TEST raid_state_function_test_sb_4k 00:23:49.392 ************************************ 00:23:49.392 00:05:49 bdev_raid -- bdev/bdev_raid.sh@845 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:23:49.392 00:05:49 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:23:49.392 00:05:49 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:49.392 00:05:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:49.392 ************************************ 00:23:49.392 START TEST raid_superblock_test_4k 00:23:49.392 ************************************ 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # 
local num_base_bdevs=2 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # raid_pid=506357 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # waitforlisten 506357 /var/tmp/spdk-raid.sock 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@827 -- # '[' -z 506357 ']' 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:49.392 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:49.392 00:05:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:49.651 [2024-05-15 00:05:50.023124] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
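
For readers skimming this trace: the raid_superblock_test_4k setup that follows reduces, roughly, to the short RPC sequence sketched below. This sketch is not part of the captured log; it is a condensed reconstruction assembled only from commands that appear verbatim in the trace (rpc.py path, -s socket, and all arguments are taken from it), shown here so the later JSON dumps are easier to read.

  # Condensed sketch (assembled from the trace, not captured output) of the setup
  # the following test performs against the bdev_svc app listening on the raid socket.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Two 32 MB malloc bdevs with 4096-byte blocks (hence num_blocks 8192 in the dumps),
  # each wrapped in a passthru bdev pt1/pt2 with a fixed UUID.
  $rpc -s $sock bdev_malloc_create 32 4096 -b malloc1
  $rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc -s $sock bdev_malloc_create 32 4096 -b malloc2
  $rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # Assemble the raid1 volume with an on-disk superblock (-s) and read back its state.
  $rpc -s $sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

  # The per-base-bdev property checks traced at bdev_raid.sh@206-209 amount to
  # assertions roughly like this one (block_size shown; md_size/md_interleave/dif_type
  # are checked the same way and are expected to be null for these malloc-backed bdevs):
  [[ "$($rpc -s $sock bdev_get_bdevs -b pt1 | jq '.[].block_size')" == 4096 ]]

The trace below then deletes the passthru bdevs and re-examines the malloc bdevs to exercise superblock-based reassembly; the sketch above covers only the initial construction.
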
00:23:49.651 [2024-05-15 00:05:50.023204] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid506357 ] 00:23:49.651 [2024-05-15 00:05:50.157363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:49.908 [2024-05-15 00:05:50.261233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:49.908 [2024-05-15 00:05:50.319109] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:49.908 [2024-05-15 00:05:50.319147] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # return 0 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:50.474 00:05:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:23:50.733 malloc1 00:23:50.733 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:50.996 [2024-05-15 00:05:51.399053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:50.996 [2024-05-15 00:05:51.399107] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:50.996 [2024-05-15 00:05:51.399132] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1145780 00:23:50.996 [2024-05-15 00:05:51.399145] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:50.996 [2024-05-15 00:05:51.400933] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:50.996 [2024-05-15 00:05:51.400965] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:50.996 pt1 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:50.996 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:23:51.254 malloc2 00:23:51.254 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:51.512 [2024-05-15 00:05:51.886560] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:51.512 [2024-05-15 00:05:51.886614] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.512 [2024-05-15 00:05:51.886634] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1146b60 00:23:51.512 [2024-05-15 00:05:51.886648] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.512 [2024-05-15 00:05:51.888242] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.512 [2024-05-15 00:05:51.888271] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:51.512 pt2 00:23:51.512 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:51.512 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:51.512 00:05:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:51.769 [2024-05-15 00:05:52.119195] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:51.769 [2024-05-15 00:05:52.120553] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:51.769 [2024-05-15 00:05:52.120715] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f21f0 00:23:51.770 [2024-05-15 00:05:52.120729] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:51.770 [2024-05-15 00:05:52.120940] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x115c670 00:23:51.770 [2024-05-15 00:05:52.121091] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f21f0 00:23:51.770 [2024-05-15 00:05:52.121101] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f21f0 00:23:51.770 [2024-05-15 00:05:52.121213] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # 
local raid_level=raid1 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.770 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.027 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:52.027 "name": "raid_bdev1", 00:23:52.027 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:52.027 "strip_size_kb": 0, 00:23:52.027 "state": "online", 00:23:52.027 "raid_level": "raid1", 00:23:52.027 "superblock": true, 00:23:52.027 "num_base_bdevs": 2, 00:23:52.027 "num_base_bdevs_discovered": 2, 00:23:52.027 "num_base_bdevs_operational": 2, 00:23:52.027 "base_bdevs_list": [ 00:23:52.027 { 00:23:52.027 "name": "pt1", 00:23:52.027 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:52.027 "is_configured": true, 00:23:52.027 "data_offset": 256, 00:23:52.027 "data_size": 7936 00:23:52.027 }, 00:23:52.027 { 00:23:52.027 "name": "pt2", 00:23:52.027 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:52.027 "is_configured": true, 00:23:52.027 "data_offset": 256, 00:23:52.027 "data_size": 7936 00:23:52.027 } 00:23:52.027 ] 00:23:52.027 }' 00:23:52.027 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:52.027 00:05:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:52.592 00:05:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:52.592 [2024-05-15 00:05:53.114020] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:52.592 "name": "raid_bdev1", 00:23:52.592 "aliases": [ 00:23:52.592 "0b11e361-3bf9-45cf-bf83-79e3abfccb87" 00:23:52.592 ], 00:23:52.592 "product_name": "Raid Volume", 00:23:52.592 
"block_size": 4096, 00:23:52.592 "num_blocks": 7936, 00:23:52.592 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:52.592 "assigned_rate_limits": { 00:23:52.592 "rw_ios_per_sec": 0, 00:23:52.592 "rw_mbytes_per_sec": 0, 00:23:52.592 "r_mbytes_per_sec": 0, 00:23:52.592 "w_mbytes_per_sec": 0 00:23:52.592 }, 00:23:52.592 "claimed": false, 00:23:52.592 "zoned": false, 00:23:52.592 "supported_io_types": { 00:23:52.592 "read": true, 00:23:52.592 "write": true, 00:23:52.592 "unmap": false, 00:23:52.592 "write_zeroes": true, 00:23:52.592 "flush": false, 00:23:52.592 "reset": true, 00:23:52.592 "compare": false, 00:23:52.592 "compare_and_write": false, 00:23:52.592 "abort": false, 00:23:52.592 "nvme_admin": false, 00:23:52.592 "nvme_io": false 00:23:52.592 }, 00:23:52.592 "memory_domains": [ 00:23:52.592 { 00:23:52.592 "dma_device_id": "system", 00:23:52.592 "dma_device_type": 1 00:23:52.592 }, 00:23:52.592 { 00:23:52.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:52.592 "dma_device_type": 2 00:23:52.592 }, 00:23:52.592 { 00:23:52.592 "dma_device_id": "system", 00:23:52.592 "dma_device_type": 1 00:23:52.592 }, 00:23:52.592 { 00:23:52.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:52.592 "dma_device_type": 2 00:23:52.592 } 00:23:52.592 ], 00:23:52.592 "driver_specific": { 00:23:52.592 "raid": { 00:23:52.592 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:52.592 "strip_size_kb": 0, 00:23:52.592 "state": "online", 00:23:52.592 "raid_level": "raid1", 00:23:52.592 "superblock": true, 00:23:52.592 "num_base_bdevs": 2, 00:23:52.592 "num_base_bdevs_discovered": 2, 00:23:52.592 "num_base_bdevs_operational": 2, 00:23:52.592 "base_bdevs_list": [ 00:23:52.592 { 00:23:52.592 "name": "pt1", 00:23:52.592 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:52.592 "is_configured": true, 00:23:52.592 "data_offset": 256, 00:23:52.592 "data_size": 7936 00:23:52.592 }, 00:23:52.592 { 00:23:52.592 "name": "pt2", 00:23:52.592 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:52.592 "is_configured": true, 00:23:52.592 "data_offset": 256, 00:23:52.592 "data_size": 7936 00:23:52.592 } 00:23:52.592 ] 00:23:52.592 } 00:23:52.592 } 00:23:52.592 }' 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:23:52.592 pt2' 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:52.592 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:52.849 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:52.849 "name": "pt1", 00:23:52.849 "aliases": [ 00:23:52.849 "2862a2f2-a1e4-5d6c-8f62-430fc6077420" 00:23:52.849 ], 00:23:52.849 "product_name": "passthru", 00:23:52.849 "block_size": 4096, 00:23:52.849 "num_blocks": 8192, 00:23:52.849 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:52.849 "assigned_rate_limits": { 00:23:52.849 "rw_ios_per_sec": 0, 00:23:52.849 "rw_mbytes_per_sec": 0, 00:23:52.849 "r_mbytes_per_sec": 0, 00:23:52.849 "w_mbytes_per_sec": 0 00:23:52.849 }, 00:23:52.849 "claimed": true, 00:23:52.849 "claim_type": "exclusive_write", 
00:23:52.849 "zoned": false, 00:23:52.849 "supported_io_types": { 00:23:52.849 "read": true, 00:23:52.849 "write": true, 00:23:52.849 "unmap": true, 00:23:52.849 "write_zeroes": true, 00:23:52.849 "flush": true, 00:23:52.849 "reset": true, 00:23:52.849 "compare": false, 00:23:52.849 "compare_and_write": false, 00:23:52.849 "abort": true, 00:23:52.849 "nvme_admin": false, 00:23:52.849 "nvme_io": false 00:23:52.849 }, 00:23:52.849 "memory_domains": [ 00:23:52.849 { 00:23:52.849 "dma_device_id": "system", 00:23:52.849 "dma_device_type": 1 00:23:52.849 }, 00:23:52.850 { 00:23:52.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:52.850 "dma_device_type": 2 00:23:52.850 } 00:23:52.850 ], 00:23:52.850 "driver_specific": { 00:23:52.850 "passthru": { 00:23:52.850 "name": "pt1", 00:23:52.850 "base_bdev_name": "malloc1" 00:23:52.850 } 00:23:52.850 } 00:23:52.850 }' 00:23:52.850 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:52.850 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:52.850 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:52.850 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:53.107 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:53.364 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:53.364 "name": "pt2", 00:23:53.364 "aliases": [ 00:23:53.364 "ca2b0999-ddb0-5756-818c-87eb7659da0f" 00:23:53.364 ], 00:23:53.364 "product_name": "passthru", 00:23:53.364 "block_size": 4096, 00:23:53.364 "num_blocks": 8192, 00:23:53.364 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:53.364 "assigned_rate_limits": { 00:23:53.364 "rw_ios_per_sec": 0, 00:23:53.364 "rw_mbytes_per_sec": 0, 00:23:53.364 "r_mbytes_per_sec": 0, 00:23:53.365 "w_mbytes_per_sec": 0 00:23:53.365 }, 00:23:53.365 "claimed": true, 00:23:53.365 "claim_type": "exclusive_write", 00:23:53.365 "zoned": false, 00:23:53.365 "supported_io_types": { 00:23:53.365 "read": true, 00:23:53.365 "write": true, 00:23:53.365 "unmap": true, 00:23:53.365 "write_zeroes": true, 00:23:53.365 "flush": true, 00:23:53.365 "reset": true, 00:23:53.365 "compare": false, 00:23:53.365 "compare_and_write": false, 00:23:53.365 "abort": true, 00:23:53.365 "nvme_admin": false, 00:23:53.365 
"nvme_io": false 00:23:53.365 }, 00:23:53.365 "memory_domains": [ 00:23:53.365 { 00:23:53.365 "dma_device_id": "system", 00:23:53.365 "dma_device_type": 1 00:23:53.365 }, 00:23:53.365 { 00:23:53.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:53.365 "dma_device_type": 2 00:23:53.365 } 00:23:53.365 ], 00:23:53.365 "driver_specific": { 00:23:53.365 "passthru": { 00:23:53.365 "name": "pt2", 00:23:53.365 "base_bdev_name": "malloc2" 00:23:53.365 } 00:23:53.365 } 00:23:53.365 }' 00:23:53.365 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:53.365 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:53.623 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:53.623 00:05:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:53.623 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:23:53.881 [2024-05-15 00:05:54.417468] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:53.881 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=0b11e361-3bf9-45cf-bf83-79e3abfccb87 00:23:53.881 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # '[' -z 0b11e361-3bf9-45cf-bf83-79e3abfccb87 ']' 00:23:53.881 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:54.138 [2024-05-15 00:05:54.649857] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:54.138 [2024-05-15 00:05:54.649882] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:54.138 [2024-05-15 00:05:54.649940] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:54.138 [2024-05-15 00:05:54.649995] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:54.138 [2024-05-15 00:05:54.650007] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f21f0 name raid_bdev1, state offline 00:23:54.138 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.138 00:05:54 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:23:54.397 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:23:54.397 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:23:54.397 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:54.397 00:05:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:54.654 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:54.654 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:54.912 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:54.912 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:55.170 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:55.428 [2024-05-15 00:05:55.784822] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:55.428 [2024-05-15 00:05:55.786171] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:55.428 [2024-05-15 00:05:55.786230] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:55.428 [2024-05-15 00:05:55.786270] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:55.428 [2024-05-15 00:05:55.786289] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:55.428 [2024-05-15 00:05:55.786299] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1145c00 name raid_bdev1, state configuring 00:23:55.428 request: 00:23:55.428 { 00:23:55.428 "name": "raid_bdev1", 00:23:55.428 "raid_level": "raid1", 00:23:55.428 "base_bdevs": [ 00:23:55.428 "malloc1", 00:23:55.428 "malloc2" 00:23:55.428 ], 00:23:55.428 "superblock": false, 00:23:55.428 "method": "bdev_raid_create", 00:23:55.428 "req_id": 1 00:23:55.428 } 00:23:55.428 Got JSON-RPC error response 00:23:55.428 response: 00:23:55.428 { 00:23:55.428 "code": -17, 00:23:55.428 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:55.428 } 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.428 00:05:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:23:55.686 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:23:55.686 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:23:55.686 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:55.944 [2024-05-15 00:05:56.278067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:55.944 [2024-05-15 00:05:56.278117] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.944 [2024-05-15 00:05:56.278143] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11459b0 00:23:55.944 [2024-05-15 00:05:56.278156] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.944 [2024-05-15 00:05:56.279764] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.944 [2024-05-15 00:05:56.279793] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:55.944 [2024-05-15 00:05:56.279863] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:23:55.944 [2024-05-15 00:05:56.279889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:55.944 pt1 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:55.944 00:05:56 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.944 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.202 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:56.202 "name": "raid_bdev1", 00:23:56.202 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:56.202 "strip_size_kb": 0, 00:23:56.202 "state": "configuring", 00:23:56.203 "raid_level": "raid1", 00:23:56.203 "superblock": true, 00:23:56.203 "num_base_bdevs": 2, 00:23:56.203 "num_base_bdevs_discovered": 1, 00:23:56.203 "num_base_bdevs_operational": 2, 00:23:56.203 "base_bdevs_list": [ 00:23:56.203 { 00:23:56.203 "name": "pt1", 00:23:56.203 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:56.203 "is_configured": true, 00:23:56.203 "data_offset": 256, 00:23:56.203 "data_size": 7936 00:23:56.203 }, 00:23:56.203 { 00:23:56.203 "name": null, 00:23:56.203 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:56.203 "is_configured": false, 00:23:56.203 "data_offset": 256, 00:23:56.203 "data_size": 7936 00:23:56.203 } 00:23:56.203 ] 00:23:56.203 }' 00:23:56.203 00:05:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:56.203 00:05:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:56.767 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:23:56.767 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:23:56.767 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:56.767 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:57.026 [2024-05-15 00:05:57.389036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:57.026 [2024-05-15 00:05:57.389089] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:57.026 [2024-05-15 00:05:57.389110] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f7110 00:23:57.026 [2024-05-15 00:05:57.389123] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:57.026 [2024-05-15 
00:05:57.389488] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:57.026 [2024-05-15 00:05:57.389506] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:57.026 [2024-05-15 00:05:57.389570] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:23:57.026 [2024-05-15 00:05:57.389591] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:57.026 [2024-05-15 00:05:57.389692] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f6a60 00:23:57.026 [2024-05-15 00:05:57.389702] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:57.026 [2024-05-15 00:05:57.389879] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11450e0 00:23:57.026 [2024-05-15 00:05:57.390006] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f6a60 00:23:57.026 [2024-05-15 00:05:57.390016] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f6a60 00:23:57.026 [2024-05-15 00:05:57.390112] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.026 pt2 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.026 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.284 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:57.284 "name": "raid_bdev1", 00:23:57.284 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:57.284 "strip_size_kb": 0, 00:23:57.284 "state": "online", 00:23:57.284 "raid_level": "raid1", 00:23:57.284 "superblock": true, 00:23:57.284 "num_base_bdevs": 2, 00:23:57.284 "num_base_bdevs_discovered": 2, 00:23:57.284 "num_base_bdevs_operational": 2, 00:23:57.284 "base_bdevs_list": [ 00:23:57.284 { 00:23:57.284 "name": "pt1", 00:23:57.284 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:57.284 "is_configured": true, 
00:23:57.284 "data_offset": 256, 00:23:57.284 "data_size": 7936 00:23:57.284 }, 00:23:57.284 { 00:23:57.284 "name": "pt2", 00:23:57.284 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:57.284 "is_configured": true, 00:23:57.284 "data_offset": 256, 00:23:57.284 "data_size": 7936 00:23:57.284 } 00:23:57.284 ] 00:23:57.284 }' 00:23:57.284 00:05:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:57.284 00:05:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:57.850 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:58.108 [2024-05-15 00:05:58.488159] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:58.108 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:58.108 "name": "raid_bdev1", 00:23:58.108 "aliases": [ 00:23:58.108 "0b11e361-3bf9-45cf-bf83-79e3abfccb87" 00:23:58.109 ], 00:23:58.109 "product_name": "Raid Volume", 00:23:58.109 "block_size": 4096, 00:23:58.109 "num_blocks": 7936, 00:23:58.109 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:58.109 "assigned_rate_limits": { 00:23:58.109 "rw_ios_per_sec": 0, 00:23:58.109 "rw_mbytes_per_sec": 0, 00:23:58.109 "r_mbytes_per_sec": 0, 00:23:58.109 "w_mbytes_per_sec": 0 00:23:58.109 }, 00:23:58.109 "claimed": false, 00:23:58.109 "zoned": false, 00:23:58.109 "supported_io_types": { 00:23:58.109 "read": true, 00:23:58.109 "write": true, 00:23:58.109 "unmap": false, 00:23:58.109 "write_zeroes": true, 00:23:58.109 "flush": false, 00:23:58.109 "reset": true, 00:23:58.109 "compare": false, 00:23:58.109 "compare_and_write": false, 00:23:58.109 "abort": false, 00:23:58.109 "nvme_admin": false, 00:23:58.109 "nvme_io": false 00:23:58.109 }, 00:23:58.109 "memory_domains": [ 00:23:58.109 { 00:23:58.109 "dma_device_id": "system", 00:23:58.109 "dma_device_type": 1 00:23:58.109 }, 00:23:58.109 { 00:23:58.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.109 "dma_device_type": 2 00:23:58.109 }, 00:23:58.109 { 00:23:58.109 "dma_device_id": "system", 00:23:58.109 "dma_device_type": 1 00:23:58.109 }, 00:23:58.109 { 00:23:58.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.109 "dma_device_type": 2 00:23:58.109 } 00:23:58.109 ], 00:23:58.109 "driver_specific": { 00:23:58.109 "raid": { 00:23:58.109 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:58.109 "strip_size_kb": 0, 00:23:58.109 "state": "online", 00:23:58.109 "raid_level": "raid1", 00:23:58.109 "superblock": true, 00:23:58.109 "num_base_bdevs": 2, 00:23:58.109 "num_base_bdevs_discovered": 2, 00:23:58.109 "num_base_bdevs_operational": 2, 
00:23:58.109 "base_bdevs_list": [ 00:23:58.109 { 00:23:58.109 "name": "pt1", 00:23:58.109 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:58.109 "is_configured": true, 00:23:58.109 "data_offset": 256, 00:23:58.109 "data_size": 7936 00:23:58.109 }, 00:23:58.109 { 00:23:58.109 "name": "pt2", 00:23:58.109 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:58.109 "is_configured": true, 00:23:58.109 "data_offset": 256, 00:23:58.109 "data_size": 7936 00:23:58.109 } 00:23:58.109 ] 00:23:58.109 } 00:23:58.109 } 00:23:58.109 }' 00:23:58.109 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:58.109 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:23:58.109 pt2' 00:23:58.109 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:58.109 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:58.109 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:58.367 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:58.367 "name": "pt1", 00:23:58.367 "aliases": [ 00:23:58.367 "2862a2f2-a1e4-5d6c-8f62-430fc6077420" 00:23:58.367 ], 00:23:58.367 "product_name": "passthru", 00:23:58.367 "block_size": 4096, 00:23:58.367 "num_blocks": 8192, 00:23:58.367 "uuid": "2862a2f2-a1e4-5d6c-8f62-430fc6077420", 00:23:58.367 "assigned_rate_limits": { 00:23:58.367 "rw_ios_per_sec": 0, 00:23:58.367 "rw_mbytes_per_sec": 0, 00:23:58.367 "r_mbytes_per_sec": 0, 00:23:58.367 "w_mbytes_per_sec": 0 00:23:58.367 }, 00:23:58.367 "claimed": true, 00:23:58.367 "claim_type": "exclusive_write", 00:23:58.367 "zoned": false, 00:23:58.367 "supported_io_types": { 00:23:58.367 "read": true, 00:23:58.367 "write": true, 00:23:58.367 "unmap": true, 00:23:58.367 "write_zeroes": true, 00:23:58.367 "flush": true, 00:23:58.367 "reset": true, 00:23:58.367 "compare": false, 00:23:58.367 "compare_and_write": false, 00:23:58.367 "abort": true, 00:23:58.367 "nvme_admin": false, 00:23:58.367 "nvme_io": false 00:23:58.367 }, 00:23:58.367 "memory_domains": [ 00:23:58.367 { 00:23:58.367 "dma_device_id": "system", 00:23:58.367 "dma_device_type": 1 00:23:58.367 }, 00:23:58.367 { 00:23:58.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.367 "dma_device_type": 2 00:23:58.367 } 00:23:58.367 ], 00:23:58.367 "driver_specific": { 00:23:58.367 "passthru": { 00:23:58.367 "name": "pt1", 00:23:58.367 "base_bdev_name": "malloc1" 00:23:58.367 } 00:23:58.367 } 00:23:58.367 }' 00:23:58.367 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:58.367 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:58.367 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:58.367 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:58.625 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:58.625 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:58.625 00:05:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:58.625 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:58.882 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:58.883 "name": "pt2", 00:23:58.883 "aliases": [ 00:23:58.883 "ca2b0999-ddb0-5756-818c-87eb7659da0f" 00:23:58.883 ], 00:23:58.883 "product_name": "passthru", 00:23:58.883 "block_size": 4096, 00:23:58.883 "num_blocks": 8192, 00:23:58.883 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:58.883 "assigned_rate_limits": { 00:23:58.883 "rw_ios_per_sec": 0, 00:23:58.883 "rw_mbytes_per_sec": 0, 00:23:58.883 "r_mbytes_per_sec": 0, 00:23:58.883 "w_mbytes_per_sec": 0 00:23:58.883 }, 00:23:58.883 "claimed": true, 00:23:58.883 "claim_type": "exclusive_write", 00:23:58.883 "zoned": false, 00:23:58.883 "supported_io_types": { 00:23:58.883 "read": true, 00:23:58.883 "write": true, 00:23:58.883 "unmap": true, 00:23:58.883 "write_zeroes": true, 00:23:58.883 "flush": true, 00:23:58.883 "reset": true, 00:23:58.883 "compare": false, 00:23:58.883 "compare_and_write": false, 00:23:58.883 "abort": true, 00:23:58.883 "nvme_admin": false, 00:23:58.883 "nvme_io": false 00:23:58.883 }, 00:23:58.883 "memory_domains": [ 00:23:58.883 { 00:23:58.883 "dma_device_id": "system", 00:23:58.883 "dma_device_type": 1 00:23:58.883 }, 00:23:58.883 { 00:23:58.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.883 "dma_device_type": 2 00:23:58.883 } 00:23:58.883 ], 00:23:58.883 "driver_specific": { 00:23:58.883 "passthru": { 00:23:58.883 "name": "pt2", 00:23:58.883 "base_bdev_name": "malloc2" 00:23:58.883 } 00:23:58.883 } 00:23:58.883 }' 00:23:58.883 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:58.883 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:58.883 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:58.883 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:58.883 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:59.141 00:05:59 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:59.141 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:23:59.400 [2024-05-15 00:05:59.899856] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:59.400 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # '[' 0b11e361-3bf9-45cf-bf83-79e3abfccb87 '!=' 0b11e361-3bf9-45cf-bf83-79e3abfccb87 ']' 00:23:59.400 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:23:59.400 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:23:59.400 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:23:59.400 00:05:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:59.658 [2024-05-15 00:06:00.148336] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.658 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.917 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:59.917 "name": "raid_bdev1", 00:23:59.917 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:23:59.917 "strip_size_kb": 0, 00:23:59.917 "state": "online", 00:23:59.917 "raid_level": "raid1", 00:23:59.917 "superblock": true, 00:23:59.917 "num_base_bdevs": 2, 00:23:59.917 "num_base_bdevs_discovered": 1, 00:23:59.917 "num_base_bdevs_operational": 1, 00:23:59.917 "base_bdevs_list": [ 00:23:59.917 { 00:23:59.917 "name": null, 00:23:59.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.917 "is_configured": false, 00:23:59.917 "data_offset": 256, 00:23:59.917 "data_size": 7936 00:23:59.917 }, 00:23:59.917 { 00:23:59.917 "name": "pt2", 00:23:59.917 "uuid": 
"ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:23:59.917 "is_configured": true, 00:23:59.917 "data_offset": 256, 00:23:59.917 "data_size": 7936 00:23:59.917 } 00:23:59.917 ] 00:23:59.917 }' 00:23:59.917 00:06:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:59.917 00:06:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:00.483 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:00.742 [2024-05-15 00:06:01.247210] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:00.742 [2024-05-15 00:06:01.247239] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:00.742 [2024-05-15 00:06:01.247311] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.742 [2024-05-15 00:06:01.247361] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.742 [2024-05-15 00:06:01.247373] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f6a60 name raid_bdev1, state offline 00:24:00.742 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.742 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:24:01.002 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:24:01.002 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:24:01.002 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:24:01.002 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:01.002 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # i=1 00:24:01.284 00:06:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:01.543 [2024-05-15 00:06:01.989159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:01.543 [2024-05-15 00:06:01.989206] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:01.543 [2024-05-15 00:06:01.989225] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12eeaa0 00:24:01.543 [2024-05-15 00:06:01.989238] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:01.543 [2024-05-15 00:06:01.990870] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:01.543 [2024-05-15 
00:06:01.990900] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:01.543 [2024-05-15 00:06:01.990974] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:24:01.543 [2024-05-15 00:06:01.991002] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:01.543 [2024-05-15 00:06:01.991090] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x12efbd0 00:24:01.543 [2024-05-15 00:06:01.991101] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:01.543 [2024-05-15 00:06:01.991278] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1145450 00:24:01.543 [2024-05-15 00:06:01.991414] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12efbd0 00:24:01.543 [2024-05-15 00:06:01.991424] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12efbd0 00:24:01.543 [2024-05-15 00:06:01.991527] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.543 pt2 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.543 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.801 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:01.801 "name": "raid_bdev1", 00:24:01.801 "uuid": "0b11e361-3bf9-45cf-bf83-79e3abfccb87", 00:24:01.801 "strip_size_kb": 0, 00:24:01.801 "state": "online", 00:24:01.801 "raid_level": "raid1", 00:24:01.801 "superblock": true, 00:24:01.801 "num_base_bdevs": 2, 00:24:01.801 "num_base_bdevs_discovered": 1, 00:24:01.801 "num_base_bdevs_operational": 1, 00:24:01.801 "base_bdevs_list": [ 00:24:01.801 { 00:24:01.801 "name": null, 00:24:01.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.801 "is_configured": false, 00:24:01.801 "data_offset": 256, 00:24:01.801 "data_size": 7936 00:24:01.801 }, 00:24:01.801 { 00:24:01.801 "name": "pt2", 00:24:01.801 "uuid": "ca2b0999-ddb0-5756-818c-87eb7659da0f", 00:24:01.801 "is_configured": true, 00:24:01.801 "data_offset": 256, 00:24:01.801 "data_size": 7936 00:24:01.801 } 00:24:01.801 ] 00:24:01.801 }' 
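
For reference, the verify_raid_bdev_state check traced here (bdev_raid.sh@117-@127) reduces to pulling the raid bdev's info over the bdevperf RPC socket and comparing a few fields against the expected values. The snippet below is a condensed illustration rather than the helper's exact code; the rpc.py path, socket, jq filter and field names are taken verbatim from the trace above, while the exit-on-mismatch handling is an assumption:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Fetch the raid bdev entry by name, as bdev_raid.sh@127 does.
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Expected state after deleting pt1: online, raid1, strip size 0, one operational base bdev.
    [[ $(echo "$info" | jq -r .state) == online ]] || exit 1
    [[ $(echo "$info" | jq -r .raid_level) == raid1 ]] || exit 1
    [[ $(echo "$info" | jq -r .strip_size_kb) == 0 ]] || exit 1
    [[ $(echo "$info" | jq -r .num_base_bdevs_operational) == 1 ]] || exit 1
    # Only pt2 should still be configured in base_bdevs_list (see the JSON dump above).
    echo "$info" | jq -r '.base_bdevs_list[] | select(.is_configured == true).name'   # -> pt2
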
00:24:01.802 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:01.802 00:06:02 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:02.369 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:24:02.369 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:02.369 00:06:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:24:02.628 [2024-05-15 00:06:03.036124] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # '[' 0b11e361-3bf9-45cf-bf83-79e3abfccb87 '!=' 0b11e361-3bf9-45cf-bf83-79e3abfccb87 ']' 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@568 -- # killprocess 506357 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@946 -- # '[' -z 506357 ']' 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # kill -0 506357 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # uname 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 506357 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 506357' 00:24:02.628 killing process with pid 506357 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@965 -- # kill 506357 00:24:02.628 [2024-05-15 00:06:03.106021] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:02.628 [2024-05-15 00:06:03.106093] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:02.628 [2024-05-15 00:06:03.106140] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:02.628 [2024-05-15 00:06:03.106155] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12efbd0 name raid_bdev1, state offline 00:24:02.628 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@970 -- # wait 506357 00:24:02.628 [2024-05-15 00:06:03.123134] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:02.887 00:06:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # return 0 00:24:02.887 00:24:02.887 real 0m13.403s 00:24:02.887 user 0m24.173s 00:24:02.887 sys 0m2.467s 00:24:02.887 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:02.887 00:06:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:02.887 ************************************ 00:24:02.887 END TEST raid_superblock_test_4k 00:24:02.887 ************************************ 00:24:02.887 00:06:03 bdev_raid -- bdev/bdev_raid.sh@846 -- # '[' true = true ']' 00:24:02.887 00:06:03 bdev_raid -- bdev/bdev_raid.sh@847 -- # run_test raid_rebuild_test_sb_4k 
raid_rebuild_test raid1 2 true false true 00:24:02.887 00:06:03 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:24:02.887 00:06:03 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:02.887 00:06:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:02.887 ************************************ 00:24:02.887 START TEST raid_rebuild_test_sb_4k 00:24:02.887 ************************************ 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local verify=true 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # local strip_size 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@582 -- # local create_arg 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local data_offset 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # raid_pid=508441 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@603 -- # waitforlisten 508441 /var/tmp/spdk-raid.sock 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 508441 ']' 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:02.887 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:02.888 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:02.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:02.888 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:02.888 00:06:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:03.146 [2024-05-15 00:06:03.513866] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:24:03.146 [2024-05-15 00:06:03.513926] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid508441 ] 00:24:03.146 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:03.146 Zero copy mechanism will not be used. 00:24:03.146 [2024-05-15 00:06:03.640415] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:03.404 [2024-05-15 00:06:03.743740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:03.404 [2024-05-15 00:06:03.802746] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:03.404 [2024-05-15 00:06:03.802792] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:03.972 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:03.972 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:24:03.972 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:24:03.972 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:24:04.230 BaseBdev1_malloc 00:24:04.230 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:04.489 [2024-05-15 00:06:04.931535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:04.489 [2024-05-15 00:06:04.931582] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.489 [2024-05-15 00:06:04.931605] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c90b50 00:24:04.489 [2024-05-15 00:06:04.931618] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.489 [2024-05-15 00:06:04.933451] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.489 [2024-05-15 00:06:04.933479] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:04.489 BaseBdev1 00:24:04.489 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # 
for bdev in "${base_bdevs[@]}" 00:24:04.489 00:06:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:24:04.748 BaseBdev2_malloc 00:24:04.748 00:06:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:05.006 [2024-05-15 00:06:05.410820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:05.006 [2024-05-15 00:06:05.410864] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:05.006 [2024-05-15 00:06:05.410883] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e36d10 00:24:05.006 [2024-05-15 00:06:05.410896] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:05.006 [2024-05-15 00:06:05.412530] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:05.006 [2024-05-15 00:06:05.412558] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:05.006 BaseBdev2 00:24:05.006 00:06:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:24:05.264 spare_malloc 00:24:05.264 00:06:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:05.523 spare_delay 00:24:05.523 00:06:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:05.782 [2024-05-15 00:06:06.150308] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:05.782 [2024-05-15 00:06:06.150353] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:05.782 [2024-05-15 00:06:06.150372] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e39240 00:24:05.782 [2024-05-15 00:06:06.150385] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:05.782 [2024-05-15 00:06:06.152020] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:05.782 [2024-05-15 00:06:06.152048] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:05.782 spare 00:24:05.782 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:06.040 [2024-05-15 00:06:06.378948] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:06.040 [2024-05-15 00:06:06.380275] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:06.040 [2024-05-15 00:06:06.380453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c89f00 00:24:06.040 [2024-05-15 00:06:06.380467] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:06.040 [2024-05-15 00:06:06.380663] 
bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88510 00:24:06.040 [2024-05-15 00:06:06.380808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c89f00 00:24:06.040 [2024-05-15 00:06:06.380818] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c89f00 00:24:06.040 [2024-05-15 00:06:06.380920] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.040 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.320 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:06.320 "name": "raid_bdev1", 00:24:06.320 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:06.320 "strip_size_kb": 0, 00:24:06.320 "state": "online", 00:24:06.320 "raid_level": "raid1", 00:24:06.320 "superblock": true, 00:24:06.320 "num_base_bdevs": 2, 00:24:06.320 "num_base_bdevs_discovered": 2, 00:24:06.320 "num_base_bdevs_operational": 2, 00:24:06.320 "base_bdevs_list": [ 00:24:06.320 { 00:24:06.320 "name": "BaseBdev1", 00:24:06.320 "uuid": "64e42f19-d05b-58cc-a945-c7edb2a6a200", 00:24:06.320 "is_configured": true, 00:24:06.321 "data_offset": 256, 00:24:06.321 "data_size": 7936 00:24:06.321 }, 00:24:06.321 { 00:24:06.321 "name": "BaseBdev2", 00:24:06.321 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:06.321 "is_configured": true, 00:24:06.321 "data_offset": 256, 00:24:06.321 "data_size": 7936 00:24:06.321 } 00:24:06.321 ] 00:24:06.321 }' 00:24:06.321 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:06.321 00:06:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:06.891 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:06.891 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:24:06.891 [2024-05-15 00:06:07.453968] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:06.891 
00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:07.150 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:07.408 [2024-05-15 00:06:07.951117] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88510 00:24:07.408 /dev/nbd0 00:24:07.408 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:07.408 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:07.408 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:07.409 1+0 records in 00:24:07.409 1+0 records out 00:24:07.409 4096 bytes 
(4.1 kB, 4.0 KiB) copied, 0.000249368 s, 16.4 MB/s 00:24:07.409 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:07.666 00:06:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:24:07.666 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:08.231 7936+0 records in 00:24:08.231 7936+0 records out 00:24:08.231 32505856 bytes (33 MB, 31 MiB) copied, 0.748441 s, 43.4 MB/s 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:08.231 00:06:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:08.489 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:08.489 [2024-05-15 00:06:09.028929] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:08.489 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:08.489 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:08.490 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:08.747 [2024-05-15 00:06:09.253580] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:08.747 
00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.747 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.005 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:09.005 "name": "raid_bdev1", 00:24:09.005 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:09.005 "strip_size_kb": 0, 00:24:09.005 "state": "online", 00:24:09.005 "raid_level": "raid1", 00:24:09.005 "superblock": true, 00:24:09.005 "num_base_bdevs": 2, 00:24:09.005 "num_base_bdevs_discovered": 1, 00:24:09.005 "num_base_bdevs_operational": 1, 00:24:09.005 "base_bdevs_list": [ 00:24:09.005 { 00:24:09.005 "name": null, 00:24:09.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.005 "is_configured": false, 00:24:09.005 "data_offset": 256, 00:24:09.005 "data_size": 7936 00:24:09.005 }, 00:24:09.005 { 00:24:09.005 "name": "BaseBdev2", 00:24:09.005 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:09.005 "is_configured": true, 00:24:09.005 "data_offset": 256, 00:24:09.005 "data_size": 7936 00:24:09.005 } 00:24:09.005 ] 00:24:09.005 }' 00:24:09.005 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:09.005 00:06:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:09.570 00:06:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:09.828 [2024-05-15 00:06:10.256256] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:09.828 [2024-05-15 00:06:10.261239] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c8c750 00:24:09.828 [2024-05-15 00:06:10.263457] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:09.828 00:06:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # sleep 1 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local 
raid_bdev_name=raid_bdev1 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.761 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:11.019 "name": "raid_bdev1", 00:24:11.019 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:11.019 "strip_size_kb": 0, 00:24:11.019 "state": "online", 00:24:11.019 "raid_level": "raid1", 00:24:11.019 "superblock": true, 00:24:11.019 "num_base_bdevs": 2, 00:24:11.019 "num_base_bdevs_discovered": 2, 00:24:11.019 "num_base_bdevs_operational": 2, 00:24:11.019 "process": { 00:24:11.019 "type": "rebuild", 00:24:11.019 "target": "spare", 00:24:11.019 "progress": { 00:24:11.019 "blocks": 2816, 00:24:11.019 "percent": 35 00:24:11.019 } 00:24:11.019 }, 00:24:11.019 "base_bdevs_list": [ 00:24:11.019 { 00:24:11.019 "name": "spare", 00:24:11.019 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:11.019 "is_configured": true, 00:24:11.019 "data_offset": 256, 00:24:11.019 "data_size": 7936 00:24:11.019 }, 00:24:11.019 { 00:24:11.019 "name": "BaseBdev2", 00:24:11.019 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:11.019 "is_configured": true, 00:24:11.019 "data_offset": 256, 00:24:11.019 "data_size": 7936 00:24:11.019 } 00:24:11.019 ] 00:24:11.019 }' 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.019 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:11.277 [2024-05-15 00:06:11.770435] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.277 [2024-05-15 00:06:11.775451] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:11.277 [2024-05-15 00:06:11.775493] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.277 00:06:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.535 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:11.535 "name": "raid_bdev1", 00:24:11.535 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:11.535 "strip_size_kb": 0, 00:24:11.535 "state": "online", 00:24:11.535 "raid_level": "raid1", 00:24:11.535 "superblock": true, 00:24:11.535 "num_base_bdevs": 2, 00:24:11.535 "num_base_bdevs_discovered": 1, 00:24:11.535 "num_base_bdevs_operational": 1, 00:24:11.535 "base_bdevs_list": [ 00:24:11.535 { 00:24:11.535 "name": null, 00:24:11.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.535 "is_configured": false, 00:24:11.536 "data_offset": 256, 00:24:11.536 "data_size": 7936 00:24:11.536 }, 00:24:11.536 { 00:24:11.536 "name": "BaseBdev2", 00:24:11.536 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:11.536 "is_configured": true, 00:24:11.536 "data_offset": 256, 00:24:11.536 "data_size": 7936 00:24:11.536 } 00:24:11.536 ] 00:24:11.536 }' 00:24:11.536 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:11.536 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.101 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.358 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:12.358 "name": "raid_bdev1", 00:24:12.358 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:12.358 "strip_size_kb": 0, 00:24:12.358 "state": "online", 00:24:12.358 "raid_level": "raid1", 00:24:12.358 "superblock": true, 00:24:12.358 "num_base_bdevs": 2, 00:24:12.358 "num_base_bdevs_discovered": 1, 00:24:12.358 "num_base_bdevs_operational": 1, 00:24:12.358 "base_bdevs_list": [ 00:24:12.358 { 00:24:12.358 "name": null, 00:24:12.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.358 "is_configured": false, 00:24:12.358 "data_offset": 
256, 00:24:12.358 "data_size": 7936 00:24:12.358 }, 00:24:12.358 { 00:24:12.358 "name": "BaseBdev2", 00:24:12.358 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:12.358 "is_configured": true, 00:24:12.358 "data_offset": 256, 00:24:12.358 "data_size": 7936 00:24:12.358 } 00:24:12.358 ] 00:24:12.358 }' 00:24:12.358 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:12.358 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.358 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:12.616 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:12.617 00:06:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:12.617 [2024-05-15 00:06:13.200428] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:12.617 [2024-05-15 00:06:13.206028] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e42680 00:24:12.874 [2024-05-15 00:06:13.207539] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:12.874 00:06:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # sleep 1 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.807 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.065 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:14.065 "name": "raid_bdev1", 00:24:14.065 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:14.065 "strip_size_kb": 0, 00:24:14.065 "state": "online", 00:24:14.065 "raid_level": "raid1", 00:24:14.066 "superblock": true, 00:24:14.066 "num_base_bdevs": 2, 00:24:14.066 "num_base_bdevs_discovered": 2, 00:24:14.066 "num_base_bdevs_operational": 2, 00:24:14.066 "process": { 00:24:14.066 "type": "rebuild", 00:24:14.066 "target": "spare", 00:24:14.066 "progress": { 00:24:14.066 "blocks": 3072, 00:24:14.066 "percent": 38 00:24:14.066 } 00:24:14.066 }, 00:24:14.066 "base_bdevs_list": [ 00:24:14.066 { 00:24:14.066 "name": "spare", 00:24:14.066 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:14.066 "is_configured": true, 00:24:14.066 "data_offset": 256, 00:24:14.066 "data_size": 7936 00:24:14.066 }, 00:24:14.066 { 00:24:14.066 "name": "BaseBdev2", 00:24:14.066 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:14.066 "is_configured": true, 00:24:14.066 "data_offset": 256, 00:24:14.066 "data_size": 7936 00:24:14.066 } 00:24:14.066 ] 00:24:14.066 }' 00:24:14.066 
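
The round just traced, like the first spare rebuild earlier in this test, follows the same pattern: re-add the removed target with bdev_raid_add_base_bdev, give the rebuild a moment to start, then confirm a rebuild process targeting the spare before watching its progress (the jq checks on the next entries). A condensed sketch of that pattern; the RPC names, socket and jq filters appear as-is in the trace, while the variables and error handling are illustrative:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare   # kicks off the rebuild (bdev_raid.sh@667)
    sleep 1                                                  # the script sleeps before verifying (@668)
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(echo "$info" | jq -r '.process.type // "none"') == rebuild ]] || exit 1
    [[ $(echo "$info" | jq -r '.process.target // "none"') == spare ]] || exit 1
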
00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:24:14.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@711 -- # local timeout=865 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.066 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.324 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:14.324 "name": "raid_bdev1", 00:24:14.324 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:14.324 "strip_size_kb": 0, 00:24:14.324 "state": "online", 00:24:14.324 "raid_level": "raid1", 00:24:14.324 "superblock": true, 00:24:14.324 "num_base_bdevs": 2, 00:24:14.324 "num_base_bdevs_discovered": 2, 00:24:14.324 "num_base_bdevs_operational": 2, 00:24:14.324 "process": { 00:24:14.324 "type": "rebuild", 00:24:14.324 "target": "spare", 00:24:14.324 "progress": { 00:24:14.324 "blocks": 3840, 00:24:14.324 "percent": 48 00:24:14.324 } 00:24:14.324 }, 00:24:14.324 "base_bdevs_list": [ 00:24:14.324 { 00:24:14.324 "name": "spare", 00:24:14.324 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:14.324 "is_configured": true, 00:24:14.324 "data_offset": 256, 00:24:14.324 "data_size": 7936 00:24:14.324 }, 00:24:14.324 { 00:24:14.324 "name": "BaseBdev2", 00:24:14.324 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:14.324 "is_configured": true, 00:24:14.324 "data_offset": 256, 00:24:14.324 "data_size": 7936 00:24:14.324 } 00:24:14.324 ] 00:24:14.324 }' 00:24:14.324 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:14.324 
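
From here the script simply polls until the rebuild finishes: it re-reads the raid bdev each second and keeps looping while a rebuild process is still reported, bounded by the timeout set above (local timeout=865). The dump above shows the rebuild at 48%; the entries that follow trace it to 93% and then to the "process completed" notice. A compressed sketch of that wait loop, with the timeout, jq filter and sleep taken from the trace; the real helper re-verifies the target and base bdev lists on every pass, so only the loop skeleton is shown:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    while (( SECONDS < 865 )); do   # bdev_raid.sh@712: (( SECONDS < timeout ))
        info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
        # Stop once no rebuild process is reported anymore (the @714 break in the trace).
        [[ $(echo "$info" | jq -r '.process.type // "none"') == rebuild ]] || break
        echo "$info" | jq -r .process.progress.percent   # 48, then 93 in this run
        sleep 1                                          # bdev_raid.sh@716
    done
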
00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:14.324 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:14.324 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:14.324 00:06:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@716 -- # sleep 1 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.699 00:06:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:15.699 "name": "raid_bdev1", 00:24:15.699 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:15.699 "strip_size_kb": 0, 00:24:15.699 "state": "online", 00:24:15.699 "raid_level": "raid1", 00:24:15.699 "superblock": true, 00:24:15.699 "num_base_bdevs": 2, 00:24:15.699 "num_base_bdevs_discovered": 2, 00:24:15.699 "num_base_bdevs_operational": 2, 00:24:15.699 "process": { 00:24:15.699 "type": "rebuild", 00:24:15.699 "target": "spare", 00:24:15.699 "progress": { 00:24:15.699 "blocks": 7424, 00:24:15.699 "percent": 93 00:24:15.699 } 00:24:15.699 }, 00:24:15.699 "base_bdevs_list": [ 00:24:15.699 { 00:24:15.699 "name": "spare", 00:24:15.699 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:15.699 "is_configured": true, 00:24:15.699 "data_offset": 256, 00:24:15.699 "data_size": 7936 00:24:15.699 }, 00:24:15.699 { 00:24:15.699 "name": "BaseBdev2", 00:24:15.699 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:15.699 "is_configured": true, 00:24:15.699 "data_offset": 256, 00:24:15.699 "data_size": 7936 00:24:15.699 } 00:24:15.699 ] 00:24:15.699 }' 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.699 00:06:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@716 -- # sleep 1 00:24:15.957 [2024-05-15 00:06:16.332030] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:15.957 [2024-05-15 00:06:16.332089] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:15.957 [2024-05-15 00:06:16.332174] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.890 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:17.148 "name": "raid_bdev1", 00:24:17.148 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:17.148 "strip_size_kb": 0, 00:24:17.148 "state": "online", 00:24:17.148 "raid_level": "raid1", 00:24:17.148 "superblock": true, 00:24:17.148 "num_base_bdevs": 2, 00:24:17.148 "num_base_bdevs_discovered": 2, 00:24:17.148 "num_base_bdevs_operational": 2, 00:24:17.148 "base_bdevs_list": [ 00:24:17.148 { 00:24:17.148 "name": "spare", 00:24:17.148 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:17.148 "is_configured": true, 00:24:17.148 "data_offset": 256, 00:24:17.148 "data_size": 7936 00:24:17.148 }, 00:24:17.148 { 00:24:17.148 "name": "BaseBdev2", 00:24:17.148 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:17.148 "is_configured": true, 00:24:17.148 "data_offset": 256, 00:24:17.148 "data_size": 7936 00:24:17.148 } 00:24:17.148 ] 00:24:17.148 }' 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # break 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:17.148 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:17.149 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.149 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # 
raid_bdev_info='{ 00:24:17.407 "name": "raid_bdev1", 00:24:17.407 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:17.407 "strip_size_kb": 0, 00:24:17.407 "state": "online", 00:24:17.407 "raid_level": "raid1", 00:24:17.407 "superblock": true, 00:24:17.407 "num_base_bdevs": 2, 00:24:17.407 "num_base_bdevs_discovered": 2, 00:24:17.407 "num_base_bdevs_operational": 2, 00:24:17.407 "base_bdevs_list": [ 00:24:17.407 { 00:24:17.407 "name": "spare", 00:24:17.407 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:17.407 "is_configured": true, 00:24:17.407 "data_offset": 256, 00:24:17.407 "data_size": 7936 00:24:17.407 }, 00:24:17.407 { 00:24:17.407 "name": "BaseBdev2", 00:24:17.407 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:17.407 "is_configured": true, 00:24:17.407 "data_offset": 256, 00:24:17.407 "data_size": 7936 00:24:17.407 } 00:24:17.407 ] 00:24:17.407 }' 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.407 00:06:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.666 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:17.666 "name": "raid_bdev1", 00:24:17.666 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:17.666 "strip_size_kb": 0, 00:24:17.666 "state": "online", 00:24:17.666 "raid_level": "raid1", 00:24:17.666 "superblock": true, 00:24:17.666 "num_base_bdevs": 2, 00:24:17.666 "num_base_bdevs_discovered": 2, 00:24:17.666 "num_base_bdevs_operational": 2, 00:24:17.666 "base_bdevs_list": [ 00:24:17.666 { 00:24:17.666 "name": "spare", 00:24:17.666 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:17.666 "is_configured": true, 00:24:17.666 "data_offset": 256, 00:24:17.666 "data_size": 7936 00:24:17.666 }, 00:24:17.666 { 00:24:17.666 "name": 
"BaseBdev2", 00:24:17.666 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:17.666 "is_configured": true, 00:24:17.666 "data_offset": 256, 00:24:17.666 "data_size": 7936 00:24:17.666 } 00:24:17.666 ] 00:24:17.666 }' 00:24:17.666 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:17.666 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:18.263 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:18.521 [2024-05-15 00:06:18.976360] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:18.521 [2024-05-15 00:06:18.976387] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:18.521 [2024-05-15 00:06:18.976452] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:18.521 [2024-05-15 00:06:18.976507] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:18.521 [2024-05-15 00:06:18.976519] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c89f00 name raid_bdev1, state offline 00:24:18.521 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.521 00:06:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # jq length 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:18.779 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:18.780 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:19.038 /dev/nbd0 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@865 -- # local i 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.038 1+0 records in 00:24:19.038 1+0 records out 00:24:19.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242786 s, 16.9 MB/s 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.038 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:19.296 /dev/nbd1 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:19.296 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.297 1+0 records in 00:24:19.297 1+0 records out 00:24:19.297 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300524 s, 13.6 MB/s 00:24:19.297 
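At this point BaseBdev1 and the rebuilt spare are both exported as NBD block devices (/dev/nbd0 and /dev/nbd1), and waitfornbd has confirmed each one is readable with a 4 KiB dd. The trace that follows compares the two devices with cmp -i 1048576, i.e. skipping the first 1 MiB of each device, which matches the reported data_offset of 256 blocks at a 4096-byte block size. A condensed sketch of that data-integrity check, using the RPC names from this log and the same illustrative rpc/sock variables as above:

    # Sketch of the post-rebuild data check: export the original base bdev and the
    # rebuilt spare over NBD, then require the data regions (past the 1 MiB
    # superblock/metadata area) to be byte-identical.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    "$rpc" -s "$sock" nbd_start_disk BaseBdev1 /dev/nbd0
    "$rpc" -s "$sock" nbd_start_disk spare /dev/nbd1

    cmp -i 1048576 /dev/nbd0 /dev/nbd1   # 1048576 = 256 blocks * 4096 bytes

    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd0
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd1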
00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.297 00:06:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:19.554 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:19.554 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:19.554 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:19.554 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:19.554 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.555 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:19.555 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:19.555 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.555 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.555 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.813 00:06:20 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:24:19.813 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:20.071 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:20.329 [2024-05-15 00:06:20.830737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:20.329 [2024-05-15 00:06:20.830786] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.329 [2024-05-15 00:06:20.830806] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c88e00 00:24:20.329 [2024-05-15 00:06:20.830819] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.329 [2024-05-15 00:06:20.832424] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.329 [2024-05-15 00:06:20.832453] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:20.329 [2024-05-15 00:06:20.832522] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:20.329 [2024-05-15 00:06:20.832549] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:20.329 BaseBdev1 00:24:20.329 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:24:20.329 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:24:20.329 00:06:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:24:20.587 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:20.845 [2024-05-15 00:06:21.312005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:20.845 [2024-05-15 00:06:21.312039] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.845 [2024-05-15 00:06:21.312058] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c89920 00:24:20.845 [2024-05-15 00:06:21.312070] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.845 [2024-05-15 00:06:21.312381] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.845 [2024-05-15 00:06:21.312406] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:20.845 [2024-05-15 00:06:21.312463] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: 
*DEBUG*: raid superblock found on bdev BaseBdev2 00:24:20.845 [2024-05-15 00:06:21.312475] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:24:20.845 [2024-05-15 00:06:21.312484] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:20.845 [2024-05-15 00:06:21.312498] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c87640 name raid_bdev1, state configuring 00:24:20.845 [2024-05-15 00:06:21.312527] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:20.845 BaseBdev2 00:24:20.845 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:21.102 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:21.360 [2024-05-15 00:06:21.789269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:21.360 [2024-05-15 00:06:21.789304] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.360 [2024-05-15 00:06:21.789322] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e38510 00:24:21.360 [2024-05-15 00:06:21.789334] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.360 [2024-05-15 00:06:21.789674] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.360 [2024-05-15 00:06:21.789691] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:21.360 [2024-05-15 00:06:21.789759] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:24:21.360 [2024-05-15 00:06:21.789778] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:21.360 spare 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:21.360 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:21.361 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:21.361 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:21.361 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:21.361 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.361 00:06:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.361 
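With the passthru bdevs re-created on top of the malloc base devices, SPDK's examine path finds the raid superblock on each of them and re-assembles raid_bdev1 without an explicit create call; the test then runs verify_raid_bdev_state raid_bdev1 online raid1 0 2 against the RPC output. A rough sketch of that kind of state assertion, again with illustrative rpc/sock variables (the exact fields verify_raid_bdev_state inspects are not all visible in this trace):

    # Sketch of a post-reassembly state check driven by bdev_raid_get_bdevs.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1")')

    [[ $(jq -r '.state' <<< "$info") == online ]] || exit 1
    [[ $(jq -r '.raid_level' <<< "$info") == raid1 ]] || exit 1
    [[ $(jq -r '.num_base_bdevs_operational' <<< "$info") == 2 ]] || exit 1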
[2024-05-15 00:06:21.890101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c8e4f0 00:24:21.361 [2024-05-15 00:06:21.890116] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:21.361 [2024-05-15 00:06:21.890291] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c8af00 00:24:21.361 [2024-05-15 00:06:21.890438] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c8e4f0 00:24:21.361 [2024-05-15 00:06:21.890448] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c8e4f0 00:24:21.361 [2024-05-15 00:06:21.890548] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.619 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:21.619 "name": "raid_bdev1", 00:24:21.619 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:21.619 "strip_size_kb": 0, 00:24:21.619 "state": "online", 00:24:21.619 "raid_level": "raid1", 00:24:21.619 "superblock": true, 00:24:21.619 "num_base_bdevs": 2, 00:24:21.619 "num_base_bdevs_discovered": 2, 00:24:21.619 "num_base_bdevs_operational": 2, 00:24:21.619 "base_bdevs_list": [ 00:24:21.619 { 00:24:21.619 "name": "spare", 00:24:21.619 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:21.619 "is_configured": true, 00:24:21.619 "data_offset": 256, 00:24:21.619 "data_size": 7936 00:24:21.619 }, 00:24:21.619 { 00:24:21.619 "name": "BaseBdev2", 00:24:21.619 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:21.619 "is_configured": true, 00:24:21.619 "data_offset": 256, 00:24:21.619 "data_size": 7936 00:24:21.619 } 00:24:21.619 ] 00:24:21.619 }' 00:24:21.619 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:21.619 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.183 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:22.440 "name": "raid_bdev1", 00:24:22.440 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:22.440 "strip_size_kb": 0, 00:24:22.440 "state": "online", 00:24:22.440 "raid_level": "raid1", 00:24:22.440 "superblock": true, 00:24:22.440 "num_base_bdevs": 2, 00:24:22.440 "num_base_bdevs_discovered": 2, 00:24:22.440 "num_base_bdevs_operational": 2, 00:24:22.440 "base_bdevs_list": [ 00:24:22.440 { 00:24:22.440 "name": "spare", 00:24:22.440 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:22.440 "is_configured": true, 00:24:22.440 "data_offset": 256, 00:24:22.440 "data_size": 7936 00:24:22.440 }, 00:24:22.440 { 
00:24:22.440 "name": "BaseBdev2", 00:24:22.440 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:22.440 "is_configured": true, 00:24:22.440 "data_offset": 256, 00:24:22.440 "data_size": 7936 00:24:22.440 } 00:24:22.440 ] 00:24:22.440 }' 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.440 00:06:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:22.699 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:24:22.699 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:22.957 [2024-05-15 00:06:23.437771] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.957 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.214 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:23.214 "name": "raid_bdev1", 00:24:23.214 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:23.214 "strip_size_kb": 0, 00:24:23.214 "state": "online", 00:24:23.214 "raid_level": "raid1", 00:24:23.214 "superblock": true, 00:24:23.214 "num_base_bdevs": 2, 00:24:23.214 "num_base_bdevs_discovered": 1, 00:24:23.214 "num_base_bdevs_operational": 1, 00:24:23.214 "base_bdevs_list": [ 00:24:23.214 { 00:24:23.214 "name": null, 00:24:23.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.214 "is_configured": false, 00:24:23.214 
"data_offset": 256, 00:24:23.214 "data_size": 7936 00:24:23.214 }, 00:24:23.214 { 00:24:23.214 "name": "BaseBdev2", 00:24:23.214 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:23.214 "is_configured": true, 00:24:23.214 "data_offset": 256, 00:24:23.214 "data_size": 7936 00:24:23.214 } 00:24:23.214 ] 00:24:23.214 }' 00:24:23.214 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:23.214 00:06:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:23.779 00:06:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:24.037 [2024-05-15 00:06:24.516656] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:24.037 [2024-05-15 00:06:24.516807] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:24.037 [2024-05-15 00:06:24.516824] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:24.037 [2024-05-15 00:06:24.516853] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:24.037 [2024-05-15 00:06:24.521632] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c8af00 00:24:24.037 [2024-05-15 00:06:24.522968] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:24.037 00:06:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # sleep 1 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.972 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.230 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:25.230 "name": "raid_bdev1", 00:24:25.230 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:25.230 "strip_size_kb": 0, 00:24:25.230 "state": "online", 00:24:25.230 "raid_level": "raid1", 00:24:25.230 "superblock": true, 00:24:25.230 "num_base_bdevs": 2, 00:24:25.230 "num_base_bdevs_discovered": 2, 00:24:25.230 "num_base_bdevs_operational": 2, 00:24:25.230 "process": { 00:24:25.230 "type": "rebuild", 00:24:25.230 "target": "spare", 00:24:25.230 "progress": { 00:24:25.230 "blocks": 3072, 00:24:25.230 "percent": 38 00:24:25.230 } 00:24:25.230 }, 00:24:25.230 "base_bdevs_list": [ 00:24:25.230 { 00:24:25.230 "name": "spare", 00:24:25.230 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:25.230 "is_configured": true, 00:24:25.230 "data_offset": 256, 00:24:25.230 "data_size": 7936 00:24:25.230 }, 00:24:25.230 { 00:24:25.230 "name": "BaseBdev2", 00:24:25.230 "uuid": 
"0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:25.230 "is_configured": true, 00:24:25.230 "data_offset": 256, 00:24:25.230 "data_size": 7936 00:24:25.230 } 00:24:25.230 ] 00:24:25.230 }' 00:24:25.230 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:25.489 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:25.489 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:25.489 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.489 00:06:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:25.748 [2024-05-15 00:06:26.102406] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:25.748 [2024-05-15 00:06:26.135421] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:25.748 [2024-05-15 00:06:26.135465] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.748 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.007 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:26.007 "name": "raid_bdev1", 00:24:26.007 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:26.007 "strip_size_kb": 0, 00:24:26.007 "state": "online", 00:24:26.007 "raid_level": "raid1", 00:24:26.007 "superblock": true, 00:24:26.007 "num_base_bdevs": 2, 00:24:26.007 "num_base_bdevs_discovered": 1, 00:24:26.007 "num_base_bdevs_operational": 1, 00:24:26.007 "base_bdevs_list": [ 00:24:26.007 { 00:24:26.007 "name": null, 00:24:26.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.007 "is_configured": false, 00:24:26.007 "data_offset": 256, 00:24:26.007 "data_size": 7936 00:24:26.007 }, 00:24:26.007 { 00:24:26.007 "name": "BaseBdev2", 00:24:26.007 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:26.007 "is_configured": true, 
00:24:26.007 "data_offset": 256, 00:24:26.007 "data_size": 7936 00:24:26.007 } 00:24:26.007 ] 00:24:26.007 }' 00:24:26.007 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:26.007 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:26.574 00:06:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:26.834 [2024-05-15 00:06:27.223347] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:26.834 [2024-05-15 00:06:27.223395] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:26.834 [2024-05-15 00:06:27.223422] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8f5a0 00:24:26.834 [2024-05-15 00:06:27.223435] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:26.834 [2024-05-15 00:06:27.223804] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:26.834 [2024-05-15 00:06:27.223822] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:26.834 [2024-05-15 00:06:27.223902] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:24:26.834 [2024-05-15 00:06:27.223914] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:26.834 [2024-05-15 00:06:27.223925] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:26.834 [2024-05-15 00:06:27.223945] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:26.834 [2024-05-15 00:06:27.228769] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c87ea0 00:24:26.834 spare 00:24:26.834 [2024-05-15 00:06:27.230190] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:26.834 00:06:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # sleep 1 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.768 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:28.027 "name": "raid_bdev1", 00:24:28.027 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:28.027 "strip_size_kb": 0, 00:24:28.027 "state": "online", 00:24:28.027 "raid_level": "raid1", 00:24:28.027 "superblock": true, 00:24:28.027 "num_base_bdevs": 2, 00:24:28.027 "num_base_bdevs_discovered": 2, 00:24:28.027 
"num_base_bdevs_operational": 2, 00:24:28.027 "process": { 00:24:28.027 "type": "rebuild", 00:24:28.027 "target": "spare", 00:24:28.027 "progress": { 00:24:28.027 "blocks": 3072, 00:24:28.027 "percent": 38 00:24:28.027 } 00:24:28.027 }, 00:24:28.027 "base_bdevs_list": [ 00:24:28.027 { 00:24:28.027 "name": "spare", 00:24:28.027 "uuid": "98fd503a-a9e7-5cd1-bc5a-1d3792adfe8c", 00:24:28.027 "is_configured": true, 00:24:28.027 "data_offset": 256, 00:24:28.027 "data_size": 7936 00:24:28.027 }, 00:24:28.027 { 00:24:28.027 "name": "BaseBdev2", 00:24:28.027 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:28.027 "is_configured": true, 00:24:28.027 "data_offset": 256, 00:24:28.027 "data_size": 7936 00:24:28.027 } 00:24:28.027 ] 00:24:28.027 }' 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:28.027 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:28.285 [2024-05-15 00:06:28.813463] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:28.285 [2024-05-15 00:06:28.842846] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:28.285 [2024-05-15 00:06:28.842889] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:28.285 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:28.543 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:28.543 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:28.543 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.543 00:06:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.801 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:28.801 "name": "raid_bdev1", 00:24:28.801 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:28.801 "strip_size_kb": 0, 00:24:28.801 "state": "online", 00:24:28.801 "raid_level": "raid1", 
00:24:28.801 "superblock": true, 00:24:28.801 "num_base_bdevs": 2, 00:24:28.801 "num_base_bdevs_discovered": 1, 00:24:28.801 "num_base_bdevs_operational": 1, 00:24:28.801 "base_bdevs_list": [ 00:24:28.801 { 00:24:28.801 "name": null, 00:24:28.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.801 "is_configured": false, 00:24:28.801 "data_offset": 256, 00:24:28.801 "data_size": 7936 00:24:28.801 }, 00:24:28.801 { 00:24:28.801 "name": "BaseBdev2", 00:24:28.801 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:28.801 "is_configured": true, 00:24:28.801 "data_offset": 256, 00:24:28.801 "data_size": 7936 00:24:28.801 } 00:24:28.801 ] 00:24:28.801 }' 00:24:28.801 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:28.801 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:29.366 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.367 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.624 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:29.624 "name": "raid_bdev1", 00:24:29.624 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:29.624 "strip_size_kb": 0, 00:24:29.624 "state": "online", 00:24:29.624 "raid_level": "raid1", 00:24:29.624 "superblock": true, 00:24:29.624 "num_base_bdevs": 2, 00:24:29.624 "num_base_bdevs_discovered": 1, 00:24:29.624 "num_base_bdevs_operational": 1, 00:24:29.624 "base_bdevs_list": [ 00:24:29.624 { 00:24:29.624 "name": null, 00:24:29.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.624 "is_configured": false, 00:24:29.624 "data_offset": 256, 00:24:29.624 "data_size": 7936 00:24:29.624 }, 00:24:29.624 { 00:24:29.624 "name": "BaseBdev2", 00:24:29.624 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:29.624 "is_configured": true, 00:24:29.624 "data_offset": 256, 00:24:29.624 "data_size": 7936 00:24:29.624 } 00:24:29.624 ] 00:24:29.624 }' 00:24:29.624 00:06:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:29.624 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:29.624 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:29.624 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:29.624 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:29.883 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:30.141 [2024-05-15 00:06:30.527733] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:30.141 [2024-05-15 00:06:30.527780] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.141 [2024-05-15 00:06:30.527803] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c88e00 00:24:30.141 [2024-05-15 00:06:30.527815] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.141 [2024-05-15 00:06:30.528156] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.141 [2024-05-15 00:06:30.528172] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:30.141 [2024-05-15 00:06:30.528234] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:30.141 [2024-05-15 00:06:30.528246] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:30.141 [2024-05-15 00:06:30.528256] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:30.141 BaseBdev1 00:24:30.141 00:06:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@786 -- # sleep 1 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.076 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.334 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:31.334 "name": "raid_bdev1", 00:24:31.334 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:31.334 "strip_size_kb": 0, 00:24:31.334 "state": "online", 00:24:31.334 "raid_level": "raid1", 00:24:31.334 "superblock": true, 00:24:31.334 "num_base_bdevs": 2, 00:24:31.334 "num_base_bdevs_discovered": 1, 00:24:31.334 "num_base_bdevs_operational": 1, 00:24:31.334 "base_bdevs_list": [ 00:24:31.334 { 00:24:31.334 "name": null, 00:24:31.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.334 "is_configured": false, 00:24:31.334 "data_offset": 256, 00:24:31.334 "data_size": 7936 
00:24:31.334 }, 00:24:31.334 { 00:24:31.334 "name": "BaseBdev2", 00:24:31.334 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:31.334 "is_configured": true, 00:24:31.334 "data_offset": 256, 00:24:31.334 "data_size": 7936 00:24:31.334 } 00:24:31.334 ] 00:24:31.334 }' 00:24:31.334 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:31.334 00:06:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.900 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.159 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:32.159 "name": "raid_bdev1", 00:24:32.159 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:32.159 "strip_size_kb": 0, 00:24:32.159 "state": "online", 00:24:32.159 "raid_level": "raid1", 00:24:32.159 "superblock": true, 00:24:32.159 "num_base_bdevs": 2, 00:24:32.159 "num_base_bdevs_discovered": 1, 00:24:32.159 "num_base_bdevs_operational": 1, 00:24:32.159 "base_bdevs_list": [ 00:24:32.159 { 00:24:32.160 "name": null, 00:24:32.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.160 "is_configured": false, 00:24:32.160 "data_offset": 256, 00:24:32.160 "data_size": 7936 00:24:32.160 }, 00:24:32.160 { 00:24:32.160 "name": "BaseBdev2", 00:24:32.160 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:32.160 "is_configured": true, 00:24:32.160 "data_offset": 256, 00:24:32.160 "data_size": 7936 00:24:32.160 } 00:24:32.160 ] 00:24:32.160 }' 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:32.160 
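The trace here sits inside the NOT wrapper the test places around bdev_raid_add_base_bdev: BaseBdev1 still carries a raid superblock with seq_number 1 while raid_bdev1 is at 5, so the request that follows is expected to be rejected with JSON-RPC error -22 ("Failed to add base bdev to RAID bdev: Invalid argument"), and NOT turns that non-zero exit status into a pass. A plain-form sketch of the same negative check, with rpc/sock as illustrative shorthand:

    # Expect the stale base bdev to be rejected; success here would be a bug.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    if "$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
        echo "unexpected success: stale BaseBdev1 was accepted" >&2
        exit 1
    fi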
00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:32.160 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:32.418 [2024-05-15 00:06:32.910084] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:32.418 [2024-05-15 00:06:32.910209] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:32.418 [2024-05-15 00:06:32.910224] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:32.419 request: 00:24:32.419 { 00:24:32.419 "raid_bdev": "raid_bdev1", 00:24:32.419 "base_bdev": "BaseBdev1", 00:24:32.419 "method": "bdev_raid_add_base_bdev", 00:24:32.419 "req_id": 1 00:24:32.419 } 00:24:32.419 Got JSON-RPC error response 00:24:32.419 response: 00:24:32.419 { 00:24:32.419 "code": -22, 00:24:32.419 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:32.419 } 00:24:32.419 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:24:32.419 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:32.419 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:32.419 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:32.419 00:06:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # sleep 1 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:33.351 00:06:33 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.351 00:06:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.608 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:33.608 "name": "raid_bdev1", 00:24:33.608 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:33.608 "strip_size_kb": 0, 00:24:33.608 "state": "online", 00:24:33.608 "raid_level": "raid1", 00:24:33.608 "superblock": true, 00:24:33.608 "num_base_bdevs": 2, 00:24:33.608 "num_base_bdevs_discovered": 1, 00:24:33.608 "num_base_bdevs_operational": 1, 00:24:33.608 "base_bdevs_list": [ 00:24:33.608 { 00:24:33.608 "name": null, 00:24:33.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.608 "is_configured": false, 00:24:33.608 "data_offset": 256, 00:24:33.608 "data_size": 7936 00:24:33.608 }, 00:24:33.608 { 00:24:33.608 "name": "BaseBdev2", 00:24:33.608 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:33.608 "is_configured": true, 00:24:33.608 "data_offset": 256, 00:24:33.608 "data_size": 7936 00:24:33.608 } 00:24:33.608 ] 00:24:33.608 }' 00:24:33.608 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:33.608 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.543 00:06:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:34.543 "name": "raid_bdev1", 00:24:34.543 "uuid": "544de383-906b-4887-9a79-c7236b7911ec", 00:24:34.543 "strip_size_kb": 0, 00:24:34.543 "state": "online", 00:24:34.543 "raid_level": "raid1", 00:24:34.543 "superblock": true, 00:24:34.543 "num_base_bdevs": 2, 00:24:34.543 "num_base_bdevs_discovered": 1, 00:24:34.543 "num_base_bdevs_operational": 1, 00:24:34.543 "base_bdevs_list": [ 00:24:34.543 { 00:24:34.543 "name": null, 00:24:34.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.543 "is_configured": false, 00:24:34.543 "data_offset": 256, 00:24:34.543 "data_size": 7936 00:24:34.543 }, 00:24:34.543 { 00:24:34.543 "name": "BaseBdev2", 00:24:34.543 "uuid": "0c8584dd-90d9-5dd4-9f21-99867554e6ac", 00:24:34.543 "is_configured": true, 00:24:34.543 "data_offset": 256, 00:24:34.543 "data_size": 7936 00:24:34.543 } 00:24:34.543 ] 
00:24:34.543 }' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # killprocess 508441 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 508441 ']' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 508441 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:34.543 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 508441 00:24:34.802 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:34.802 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:34.802 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 508441' 00:24:34.802 killing process with pid 508441 00:24:34.802 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@965 -- # kill 508441 00:24:34.802 Received shutdown signal, test time was about 60.000000 seconds 00:24:34.802 00:24:34.802 Latency(us) 00:24:34.802 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:34.802 =================================================================================================================== 00:24:34.802 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:34.802 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@970 -- # wait 508441 00:24:34.802 [2024-05-15 00:06:35.147857] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:34.802 [2024-05-15 00:06:35.147962] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:34.802 [2024-05-15 00:06:35.148004] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.802 [2024-05-15 00:06:35.148015] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c8e4f0 name raid_bdev1, state offline 00:24:34.802 [2024-05-15 00:06:35.174731] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:35.081 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@797 -- # return 0 00:24:35.081 00:24:35.081 real 0m31.945s 00:24:35.081 user 0m49.906s 00:24:35.081 sys 0m5.246s 00:24:35.081 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:35.081 00:06:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:35.081 ************************************ 00:24:35.081 END TEST raid_rebuild_test_sb_4k 00:24:35.081 ************************************ 00:24:35.081 00:06:35 bdev_raid -- bdev/bdev_raid.sh@850 -- # base_malloc_params='-m 32' 00:24:35.081 00:06:35 bdev_raid -- bdev/bdev_raid.sh@851 -- # run_test raid_state_function_test_sb_md_separate 
raid_state_function_test raid1 2 true 00:24:35.081 00:06:35 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:24:35.081 00:06:35 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:35.081 00:06:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:35.081 ************************************ 00:24:35.081 START TEST raid_state_function_test_sb_md_separate 00:24:35.081 ************************************ 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # raid_pid=513404 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # echo 'Process raid 
pid: 513404' 00:24:35.081 Process raid pid: 513404 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@247 -- # waitforlisten 513404 /var/tmp/spdk-raid.sock 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 513404 ']' 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:35.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:35.081 00:06:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:35.081 [2024-05-15 00:06:35.545873] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:24:35.081 [2024-05-15 00:06:35.545932] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:35.346 [2024-05-15 00:06:35.675725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:35.346 [2024-05-15 00:06:35.782391] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:35.346 [2024-05-15 00:06:35.847308] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:35.346 [2024-05-15 00:06:35.847346] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:35.913 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:35.913 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:24:35.913 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:36.172 [2024-05-15 00:06:36.682796] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:36.172 [2024-05-15 00:06:36.682837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:36.172 [2024-05-15 00:06:36.682848] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:36.172 [2024-05-15 00:06:36.682860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:36.172 00:06:36 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.172 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:36.431 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:36.431 "name": "Existed_Raid", 00:24:36.431 "uuid": "0347a6a9-e2fc-47a3-9200-49533ba8b978", 00:24:36.431 "strip_size_kb": 0, 00:24:36.431 "state": "configuring", 00:24:36.431 "raid_level": "raid1", 00:24:36.431 "superblock": true, 00:24:36.431 "num_base_bdevs": 2, 00:24:36.431 "num_base_bdevs_discovered": 0, 00:24:36.431 "num_base_bdevs_operational": 2, 00:24:36.431 "base_bdevs_list": [ 00:24:36.431 { 00:24:36.431 "name": "BaseBdev1", 00:24:36.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.431 "is_configured": false, 00:24:36.431 "data_offset": 0, 00:24:36.431 "data_size": 0 00:24:36.431 }, 00:24:36.431 { 00:24:36.431 "name": "BaseBdev2", 00:24:36.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.431 "is_configured": false, 00:24:36.431 "data_offset": 0, 00:24:36.431 "data_size": 0 00:24:36.431 } 00:24:36.431 ] 00:24:36.431 }' 00:24:36.431 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:36.431 00:06:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:36.997 00:06:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:37.255 [2024-05-15 00:06:37.729463] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:37.255 [2024-05-15 00:06:37.729492] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11dabc0 name Existed_Raid, state configuring 00:24:37.255 00:06:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:37.513 [2024-05-15 00:06:37.978134] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:37.513 [2024-05-15 00:06:37.978159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev1 doesn't exist now 00:24:37.513 [2024-05-15 00:06:37.978169] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:37.513 [2024-05-15 00:06:37.978181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:37.513 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:24:37.771 [2024-05-15 00:06:38.237174] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:37.772 BaseBdev1 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:37.772 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:38.030 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:38.289 [ 00:24:38.289 { 00:24:38.289 "name": "BaseBdev1", 00:24:38.289 "aliases": [ 00:24:38.289 "bbc655da-421c-4507-8c00-c08149ef008e" 00:24:38.289 ], 00:24:38.289 "product_name": "Malloc disk", 00:24:38.289 "block_size": 4096, 00:24:38.289 "num_blocks": 8192, 00:24:38.289 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:38.289 "md_size": 32, 00:24:38.289 "md_interleave": false, 00:24:38.289 "dif_type": 0, 00:24:38.289 "assigned_rate_limits": { 00:24:38.289 "rw_ios_per_sec": 0, 00:24:38.289 "rw_mbytes_per_sec": 0, 00:24:38.289 "r_mbytes_per_sec": 0, 00:24:38.289 "w_mbytes_per_sec": 0 00:24:38.289 }, 00:24:38.289 "claimed": true, 00:24:38.289 "claim_type": "exclusive_write", 00:24:38.289 "zoned": false, 00:24:38.289 "supported_io_types": { 00:24:38.289 "read": true, 00:24:38.289 "write": true, 00:24:38.289 "unmap": true, 00:24:38.289 "write_zeroes": true, 00:24:38.289 "flush": true, 00:24:38.289 "reset": true, 00:24:38.289 "compare": false, 00:24:38.289 "compare_and_write": false, 00:24:38.289 "abort": true, 00:24:38.289 "nvme_admin": false, 00:24:38.289 "nvme_io": false 00:24:38.289 }, 00:24:38.289 "memory_domains": [ 00:24:38.289 { 00:24:38.289 "dma_device_id": "system", 00:24:38.289 "dma_device_type": 1 00:24:38.289 }, 00:24:38.289 { 00:24:38.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:38.289 "dma_device_type": 2 00:24:38.289 } 00:24:38.289 ], 00:24:38.289 "driver_specific": {} 00:24:38.289 } 00:24:38.289 ] 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:38.289 "name": "Existed_Raid", 00:24:38.289 "uuid": "e880320e-2161-4d23-9af5-46271c7de457", 00:24:38.289 "strip_size_kb": 0, 00:24:38.289 "state": "configuring", 00:24:38.289 "raid_level": "raid1", 00:24:38.289 "superblock": true, 00:24:38.289 "num_base_bdevs": 2, 00:24:38.289 "num_base_bdevs_discovered": 1, 00:24:38.289 "num_base_bdevs_operational": 2, 00:24:38.289 "base_bdevs_list": [ 00:24:38.289 { 00:24:38.289 "name": "BaseBdev1", 00:24:38.289 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:38.289 "is_configured": true, 00:24:38.289 "data_offset": 256, 00:24:38.289 "data_size": 7936 00:24:38.289 }, 00:24:38.289 { 00:24:38.289 "name": "BaseBdev2", 00:24:38.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.289 "is_configured": false, 00:24:38.289 "data_offset": 0, 00:24:38.289 "data_size": 0 00:24:38.289 } 00:24:38.289 ] 00:24:38.289 }' 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:38.289 00:06:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:39.225 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:39.225 [2024-05-15 00:06:39.620865] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:39.225 [2024-05-15 00:06:39.620904] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11dae60 name Existed_Raid, state configuring 00:24:39.225 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 
BaseBdev2' -n Existed_Raid 00:24:39.484 [2024-05-15 00:06:39.869556] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:39.484 [2024-05-15 00:06:39.871040] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:39.484 [2024-05-15 00:06:39.871071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.484 00:06:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:39.743 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:39.743 "name": "Existed_Raid", 00:24:39.743 "uuid": "04189714-a683-42d6-aa1e-6bdb63a73990", 00:24:39.743 "strip_size_kb": 0, 00:24:39.743 "state": "configuring", 00:24:39.743 "raid_level": "raid1", 00:24:39.743 "superblock": true, 00:24:39.743 "num_base_bdevs": 2, 00:24:39.743 "num_base_bdevs_discovered": 1, 00:24:39.743 "num_base_bdevs_operational": 2, 00:24:39.743 "base_bdevs_list": [ 00:24:39.743 { 00:24:39.743 "name": "BaseBdev1", 00:24:39.743 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:39.743 "is_configured": true, 00:24:39.743 "data_offset": 256, 00:24:39.743 "data_size": 7936 00:24:39.743 }, 00:24:39.743 { 00:24:39.743 "name": "BaseBdev2", 00:24:39.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.743 "is_configured": false, 00:24:39.743 "data_offset": 0, 00:24:39.743 "data_size": 0 00:24:39.743 } 00:24:39.743 ] 00:24:39.743 }' 00:24:39.743 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:39.743 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
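The trace above exercises the md_separate configuration path: the RAID1 volume Existed_Raid is registered with an on-disk superblock before its base bdevs exist, so it sits in the "configuring" state until both malloc base bdevs (4096-byte blocks with a separate 32-byte metadata area) have been created and claimed. A condensed sketch of that sequence, using only the rpc.py calls visible in this trace (socket path, sizes and bdev names are taken from the log; the order is simplified and the trailing jq selector for the state field is an addition for illustration), would be:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Register the RAID1 volume with a superblock (-s); the base bdevs need not exist yet, so state is "configuring"
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    # Create the base devices: 32 MiB each, 4096-byte blocks, 32 bytes of metadata in a separate buffer (-m 32)
    $RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev1
    $RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev2
    # Once both base bdevs are claimed, the volume should report "online" with 2 of 2 base bdevs discovered
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
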
00:24:40.310 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:24:40.310 [2024-05-15 00:06:40.884267] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:40.310 [2024-05-15 00:06:40.884427] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x11da4b0 00:24:40.310 [2024-05-15 00:06:40.884446] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:40.310 [2024-05-15 00:06:40.884508] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1378970 00:24:40.310 [2024-05-15 00:06:40.884606] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11da4b0 00:24:40.310 [2024-05-15 00:06:40.884616] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11da4b0 00:24:40.310 [2024-05-15 00:06:40.884682] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:40.310 BaseBdev2 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:40.568 00:06:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:40.568 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:40.826 [ 00:24:40.826 { 00:24:40.826 "name": "BaseBdev2", 00:24:40.826 "aliases": [ 00:24:40.826 "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b" 00:24:40.826 ], 00:24:40.826 "product_name": "Malloc disk", 00:24:40.826 "block_size": 4096, 00:24:40.826 "num_blocks": 8192, 00:24:40.826 "uuid": "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b", 00:24:40.826 "md_size": 32, 00:24:40.826 "md_interleave": false, 00:24:40.826 "dif_type": 0, 00:24:40.826 "assigned_rate_limits": { 00:24:40.826 "rw_ios_per_sec": 0, 00:24:40.826 "rw_mbytes_per_sec": 0, 00:24:40.826 "r_mbytes_per_sec": 0, 00:24:40.826 "w_mbytes_per_sec": 0 00:24:40.826 }, 00:24:40.826 "claimed": true, 00:24:40.826 "claim_type": "exclusive_write", 00:24:40.826 "zoned": false, 00:24:40.826 "supported_io_types": { 00:24:40.826 "read": true, 00:24:40.826 "write": true, 00:24:40.826 "unmap": true, 00:24:40.826 "write_zeroes": true, 00:24:40.826 "flush": true, 00:24:40.826 "reset": true, 00:24:40.826 "compare": false, 00:24:40.826 "compare_and_write": false, 00:24:40.827 "abort": true, 00:24:40.827 "nvme_admin": false, 00:24:40.827 "nvme_io": false 00:24:40.827 }, 00:24:40.827 "memory_domains": [ 00:24:40.827 { 00:24:40.827 "dma_device_id": 
"system", 00:24:40.827 "dma_device_type": 1 00:24:40.827 }, 00:24:40.827 { 00:24:40.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:40.827 "dma_device_type": 2 00:24:40.827 } 00:24:40.827 ], 00:24:40.827 "driver_specific": {} 00:24:40.827 } 00:24:40.827 ] 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.827 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:41.085 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:41.085 "name": "Existed_Raid", 00:24:41.085 "uuid": "04189714-a683-42d6-aa1e-6bdb63a73990", 00:24:41.085 "strip_size_kb": 0, 00:24:41.085 "state": "online", 00:24:41.085 "raid_level": "raid1", 00:24:41.085 "superblock": true, 00:24:41.085 "num_base_bdevs": 2, 00:24:41.085 "num_base_bdevs_discovered": 2, 00:24:41.085 "num_base_bdevs_operational": 2, 00:24:41.085 "base_bdevs_list": [ 00:24:41.085 { 00:24:41.085 "name": "BaseBdev1", 00:24:41.085 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:41.085 "is_configured": true, 00:24:41.085 "data_offset": 256, 00:24:41.085 "data_size": 7936 00:24:41.085 }, 00:24:41.085 { 00:24:41.085 "name": "BaseBdev2", 00:24:41.085 "uuid": "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b", 00:24:41.085 "is_configured": true, 00:24:41.085 "data_offset": 256, 00:24:41.085 "data_size": 7936 00:24:41.085 } 00:24:41.085 ] 00:24:41.085 }' 00:24:41.085 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:41.085 00:06:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:41.651 00:06:42 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:41.651 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:41.909 [2024-05-15 00:06:42.448683] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:41.909 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:41.909 "name": "Existed_Raid", 00:24:41.909 "aliases": [ 00:24:41.909 "04189714-a683-42d6-aa1e-6bdb63a73990" 00:24:41.909 ], 00:24:41.909 "product_name": "Raid Volume", 00:24:41.909 "block_size": 4096, 00:24:41.909 "num_blocks": 7936, 00:24:41.909 "uuid": "04189714-a683-42d6-aa1e-6bdb63a73990", 00:24:41.909 "md_size": 32, 00:24:41.909 "md_interleave": false, 00:24:41.909 "dif_type": 0, 00:24:41.909 "assigned_rate_limits": { 00:24:41.909 "rw_ios_per_sec": 0, 00:24:41.909 "rw_mbytes_per_sec": 0, 00:24:41.909 "r_mbytes_per_sec": 0, 00:24:41.909 "w_mbytes_per_sec": 0 00:24:41.909 }, 00:24:41.909 "claimed": false, 00:24:41.909 "zoned": false, 00:24:41.909 "supported_io_types": { 00:24:41.909 "read": true, 00:24:41.909 "write": true, 00:24:41.909 "unmap": false, 00:24:41.909 "write_zeroes": true, 00:24:41.909 "flush": false, 00:24:41.909 "reset": true, 00:24:41.909 "compare": false, 00:24:41.909 "compare_and_write": false, 00:24:41.909 "abort": false, 00:24:41.909 "nvme_admin": false, 00:24:41.909 "nvme_io": false 00:24:41.909 }, 00:24:41.909 "memory_domains": [ 00:24:41.909 { 00:24:41.909 "dma_device_id": "system", 00:24:41.909 "dma_device_type": 1 00:24:41.909 }, 00:24:41.909 { 00:24:41.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:41.909 "dma_device_type": 2 00:24:41.909 }, 00:24:41.909 { 00:24:41.909 "dma_device_id": "system", 00:24:41.909 "dma_device_type": 1 00:24:41.909 }, 00:24:41.909 { 00:24:41.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:41.909 "dma_device_type": 2 00:24:41.909 } 00:24:41.909 ], 00:24:41.909 "driver_specific": { 00:24:41.909 "raid": { 00:24:41.909 "uuid": "04189714-a683-42d6-aa1e-6bdb63a73990", 00:24:41.909 "strip_size_kb": 0, 00:24:41.909 "state": "online", 00:24:41.909 "raid_level": "raid1", 00:24:41.909 "superblock": true, 00:24:41.909 "num_base_bdevs": 2, 00:24:41.909 "num_base_bdevs_discovered": 2, 00:24:41.909 "num_base_bdevs_operational": 2, 00:24:41.909 "base_bdevs_list": [ 00:24:41.909 { 00:24:41.909 "name": "BaseBdev1", 00:24:41.909 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:41.909 "is_configured": true, 00:24:41.909 "data_offset": 256, 00:24:41.909 "data_size": 7936 00:24:41.909 }, 00:24:41.909 { 00:24:41.909 "name": "BaseBdev2", 
00:24:41.909 "uuid": "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b", 00:24:41.909 "is_configured": true, 00:24:41.909 "data_offset": 256, 00:24:41.909 "data_size": 7936 00:24:41.909 } 00:24:41.909 ] 00:24:41.909 } 00:24:41.909 } 00:24:41.909 }' 00:24:41.909 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:42.167 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:24:42.167 BaseBdev2' 00:24:42.167 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:42.167 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:42.167 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:42.168 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:42.168 "name": "BaseBdev1", 00:24:42.168 "aliases": [ 00:24:42.168 "bbc655da-421c-4507-8c00-c08149ef008e" 00:24:42.168 ], 00:24:42.168 "product_name": "Malloc disk", 00:24:42.168 "block_size": 4096, 00:24:42.168 "num_blocks": 8192, 00:24:42.168 "uuid": "bbc655da-421c-4507-8c00-c08149ef008e", 00:24:42.168 "md_size": 32, 00:24:42.168 "md_interleave": false, 00:24:42.168 "dif_type": 0, 00:24:42.168 "assigned_rate_limits": { 00:24:42.168 "rw_ios_per_sec": 0, 00:24:42.168 "rw_mbytes_per_sec": 0, 00:24:42.168 "r_mbytes_per_sec": 0, 00:24:42.168 "w_mbytes_per_sec": 0 00:24:42.168 }, 00:24:42.168 "claimed": true, 00:24:42.168 "claim_type": "exclusive_write", 00:24:42.168 "zoned": false, 00:24:42.168 "supported_io_types": { 00:24:42.168 "read": true, 00:24:42.168 "write": true, 00:24:42.168 "unmap": true, 00:24:42.168 "write_zeroes": true, 00:24:42.168 "flush": true, 00:24:42.168 "reset": true, 00:24:42.168 "compare": false, 00:24:42.168 "compare_and_write": false, 00:24:42.168 "abort": true, 00:24:42.168 "nvme_admin": false, 00:24:42.168 "nvme_io": false 00:24:42.168 }, 00:24:42.168 "memory_domains": [ 00:24:42.168 { 00:24:42.168 "dma_device_id": "system", 00:24:42.168 "dma_device_type": 1 00:24:42.168 }, 00:24:42.168 { 00:24:42.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.168 "dma_device_type": 2 00:24:42.168 } 00:24:42.168 ], 00:24:42.168 "driver_specific": {} 00:24:42.168 }' 00:24:42.168 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:42.168 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:42.426 00:06:42 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:42.426 00:06:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:42.683 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:42.683 "name": "BaseBdev2", 00:24:42.683 "aliases": [ 00:24:42.683 "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b" 00:24:42.683 ], 00:24:42.683 "product_name": "Malloc disk", 00:24:42.683 "block_size": 4096, 00:24:42.683 "num_blocks": 8192, 00:24:42.683 "uuid": "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b", 00:24:42.683 "md_size": 32, 00:24:42.683 "md_interleave": false, 00:24:42.683 "dif_type": 0, 00:24:42.683 "assigned_rate_limits": { 00:24:42.683 "rw_ios_per_sec": 0, 00:24:42.683 "rw_mbytes_per_sec": 0, 00:24:42.683 "r_mbytes_per_sec": 0, 00:24:42.683 "w_mbytes_per_sec": 0 00:24:42.683 }, 00:24:42.683 "claimed": true, 00:24:42.683 "claim_type": "exclusive_write", 00:24:42.683 "zoned": false, 00:24:42.683 "supported_io_types": { 00:24:42.683 "read": true, 00:24:42.683 "write": true, 00:24:42.683 "unmap": true, 00:24:42.683 "write_zeroes": true, 00:24:42.683 "flush": true, 00:24:42.683 "reset": true, 00:24:42.683 "compare": false, 00:24:42.683 "compare_and_write": false, 00:24:42.683 "abort": true, 00:24:42.683 "nvme_admin": false, 00:24:42.683 "nvme_io": false 00:24:42.683 }, 00:24:42.683 "memory_domains": [ 00:24:42.683 { 00:24:42.684 "dma_device_id": "system", 00:24:42.684 "dma_device_type": 1 00:24:42.684 }, 00:24:42.684 { 00:24:42.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.684 "dma_device_type": 2 00:24:42.684 } 00:24:42.684 ], 00:24:42.684 "driver_specific": {} 00:24:42.684 }' 00:24:42.684 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:42.684 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:42.684 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:42.684 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:42.942 00:06:43 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:42.942 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:43.201 [2024-05-15 00:06:43.743932] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # local expected_state 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.201 00:06:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:43.460 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:43.460 "name": "Existed_Raid", 00:24:43.460 "uuid": "04189714-a683-42d6-aa1e-6bdb63a73990", 00:24:43.460 "strip_size_kb": 0, 00:24:43.460 "state": "online", 00:24:43.460 "raid_level": "raid1", 00:24:43.460 "superblock": true, 00:24:43.460 "num_base_bdevs": 2, 00:24:43.460 "num_base_bdevs_discovered": 1, 00:24:43.460 "num_base_bdevs_operational": 1, 00:24:43.460 "base_bdevs_list": [ 00:24:43.460 { 00:24:43.460 "name": null, 00:24:43.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.460 "is_configured": 
false, 00:24:43.460 "data_offset": 256, 00:24:43.460 "data_size": 7936 00:24:43.460 }, 00:24:43.460 { 00:24:43.460 "name": "BaseBdev2", 00:24:43.460 "uuid": "7da4cb7e-709b-4f0b-994d-7f0ab1d3013b", 00:24:43.460 "is_configured": true, 00:24:43.460 "data_offset": 256, 00:24:43.460 "data_size": 7936 00:24:43.460 } 00:24:43.460 ] 00:24:43.460 }' 00:24:43.460 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:43.460 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:44.394 00:06:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:44.652 [2024-05-15 00:06:45.079928] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:44.652 [2024-05-15 00:06:45.080012] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:44.652 [2024-05-15 00:06:45.093241] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:44.652 [2024-05-15 00:06:45.093305] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:44.652 [2024-05-15 00:06:45.093318] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11da4b0 name Existed_Raid, state offline 00:24:44.652 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:24:44.652 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:24:44.652 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.652 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@342 -- # killprocess 513404 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 513404 ']' 00:24:44.910 00:06:45 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 513404 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 513404 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 513404' 00:24:44.910 killing process with pid 513404 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 513404 00:24:44.910 [2024-05-15 00:06:45.396746] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:44.910 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 513404 00:24:44.910 [2024-05-15 00:06:45.397712] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:45.169 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@344 -- # return 0 00:24:45.169 00:24:45.169 real 0m10.165s 00:24:45.169 user 0m18.033s 00:24:45.169 sys 0m1.884s 00:24:45.169 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:45.170 00:06:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:45.170 ************************************ 00:24:45.170 END TEST raid_state_function_test_sb_md_separate 00:24:45.170 ************************************ 00:24:45.170 00:06:45 bdev_raid -- bdev/bdev_raid.sh@852 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:24:45.170 00:06:45 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:24:45.170 00:06:45 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:45.170 00:06:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:45.170 ************************************ 00:24:45.170 START TEST raid_superblock_test_md_separate 00:24:45.170 ************************************ 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:24:45.170 00:06:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # raid_pid=514866 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # waitforlisten 514866 /var/tmp/spdk-raid.sock 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@827 -- # '[' -z 514866 ']' 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:45.170 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:45.170 00:06:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:45.429 [2024-05-15 00:06:45.772814] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
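At this point the superblock test has started its own bdev_svc application on a private RPC socket and, as the following trace shows, builds each base device as a malloc bdev (again with separate 32-byte metadata) wrapped in a passthru bdev carrying a fixed UUID. A condensed sketch of that setup, limited to commands that appear in this trace (binary path, socket, names and UUID from the log; the backgrounding with '&' is illustrative, and the waitforlisten step the test performs before issuing RPCs is omitted here), is:

    BDEV_SVC=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Start the minimal bdev application with raid debug logging on a private RPC socket
    $BDEV_SVC -r /var/tmp/spdk-raid.sock -L bdev_raid &
    # (the test waits for the RPC listener with waitforlisten before issuing any RPCs)
    # First base device: 32 MiB malloc bdev with separate 32-byte metadata, wrapped in a passthru bdev
    # whose UUID is fixed so the base bdev is predictable when checked against the raid superblock
    $RPC bdev_malloc_create 32 4096 -m 32 -b malloc1
    $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
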
00:24:45.429 [2024-05-15 00:06:45.772857] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid514866 ] 00:24:45.429 [2024-05-15 00:06:45.886683] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:45.429 [2024-05-15 00:06:45.996412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:45.688 [2024-05-15 00:06:46.063438] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:45.688 [2024-05-15 00:06:46.063477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # return 0 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:46.255 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:24:46.513 malloc1 00:24:46.513 00:06:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:46.771 [2024-05-15 00:06:47.202191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:46.772 [2024-05-15 00:06:47.202240] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.772 [2024-05-15 00:06:47.202263] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1832300 00:24:46.772 [2024-05-15 00:06:47.202275] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.772 [2024-05-15 00:06:47.203866] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:46.772 [2024-05-15 00:06:47.203895] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:46.772 pt1 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:24:46.772 00:06:47 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:46.772 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:24:47.029 malloc2 00:24:47.029 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:47.287 [2024-05-15 00:06:47.690248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:47.287 [2024-05-15 00:06:47.690295] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.287 [2024-05-15 00:06:47.690315] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b2a40 00:24:47.287 [2024-05-15 00:06:47.690327] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.287 [2024-05-15 00:06:47.691929] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.287 [2024-05-15 00:06:47.691957] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:47.287 pt2 00:24:47.287 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:24:47.287 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:47.287 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:24:47.546 [2024-05-15 00:06:47.934907] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:47.546 [2024-05-15 00:06:47.936233] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:47.546 [2024-05-15 00:06:47.936384] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b34c0 00:24:47.546 [2024-05-15 00:06:47.936405] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:47.546 [2024-05-15 00:06:47.936481] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b3170 00:24:47.546 [2024-05-15 00:06:47.936600] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b34c0 00:24:47.546 [2024-05-15 00:06:47.936609] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b34c0 00:24:47.546 [2024-05-15 00:06:47.936682] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.546 00:06:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.805 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:47.805 "name": "raid_bdev1", 00:24:47.805 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:47.805 "strip_size_kb": 0, 00:24:47.805 "state": "online", 00:24:47.805 "raid_level": "raid1", 00:24:47.805 "superblock": true, 00:24:47.805 "num_base_bdevs": 2, 00:24:47.805 "num_base_bdevs_discovered": 2, 00:24:47.805 "num_base_bdevs_operational": 2, 00:24:47.805 "base_bdevs_list": [ 00:24:47.805 { 00:24:47.805 "name": "pt1", 00:24:47.805 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:47.805 "is_configured": true, 00:24:47.805 "data_offset": 256, 00:24:47.805 "data_size": 7936 00:24:47.805 }, 00:24:47.805 { 00:24:47.805 "name": "pt2", 00:24:47.805 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:47.805 "is_configured": true, 00:24:47.805 "data_offset": 256, 00:24:47.805 "data_size": 7936 00:24:47.805 } 00:24:47.805 ] 00:24:47.805 }' 00:24:47.805 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:47.805 00:06:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:48.369 00:06:48 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:48.626 [2024-05-15 00:06:49.005945] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:48.626 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:48.626 "name": "raid_bdev1", 00:24:48.626 "aliases": [ 00:24:48.626 "351a1b2b-ee63-4215-af42-8558f886a6df" 00:24:48.626 ], 00:24:48.626 "product_name": "Raid Volume", 00:24:48.626 "block_size": 4096, 00:24:48.626 "num_blocks": 7936, 00:24:48.626 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:48.626 "md_size": 32, 00:24:48.626 "md_interleave": false, 00:24:48.626 "dif_type": 0, 00:24:48.626 "assigned_rate_limits": { 00:24:48.626 "rw_ios_per_sec": 0, 00:24:48.626 "rw_mbytes_per_sec": 0, 00:24:48.626 "r_mbytes_per_sec": 0, 00:24:48.626 "w_mbytes_per_sec": 0 00:24:48.626 }, 00:24:48.626 "claimed": false, 00:24:48.626 "zoned": false, 00:24:48.626 "supported_io_types": { 00:24:48.626 "read": true, 00:24:48.626 "write": true, 00:24:48.626 "unmap": false, 00:24:48.626 "write_zeroes": true, 00:24:48.626 "flush": false, 00:24:48.626 "reset": true, 00:24:48.626 "compare": false, 00:24:48.626 "compare_and_write": false, 00:24:48.626 "abort": false, 00:24:48.626 "nvme_admin": false, 00:24:48.626 "nvme_io": false 00:24:48.626 }, 00:24:48.626 "memory_domains": [ 00:24:48.626 { 00:24:48.626 "dma_device_id": "system", 00:24:48.626 "dma_device_type": 1 00:24:48.626 }, 00:24:48.626 { 00:24:48.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.627 "dma_device_type": 2 00:24:48.627 }, 00:24:48.627 { 00:24:48.627 "dma_device_id": "system", 00:24:48.627 "dma_device_type": 1 00:24:48.627 }, 00:24:48.627 { 00:24:48.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.627 "dma_device_type": 2 00:24:48.627 } 00:24:48.627 ], 00:24:48.627 "driver_specific": { 00:24:48.627 "raid": { 00:24:48.627 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:48.627 "strip_size_kb": 0, 00:24:48.627 "state": "online", 00:24:48.627 "raid_level": "raid1", 00:24:48.627 "superblock": true, 00:24:48.627 "num_base_bdevs": 2, 00:24:48.627 "num_base_bdevs_discovered": 2, 00:24:48.627 "num_base_bdevs_operational": 2, 00:24:48.627 "base_bdevs_list": [ 00:24:48.627 { 00:24:48.627 "name": "pt1", 00:24:48.627 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:48.627 "is_configured": true, 00:24:48.627 "data_offset": 256, 00:24:48.627 "data_size": 7936 00:24:48.627 }, 00:24:48.627 { 00:24:48.627 "name": "pt2", 00:24:48.627 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:48.627 "is_configured": true, 00:24:48.627 "data_offset": 256, 00:24:48.627 "data_size": 7936 00:24:48.627 } 00:24:48.627 ] 00:24:48.627 } 00:24:48.627 } 00:24:48.627 }' 00:24:48.627 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:48.627 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:24:48.627 pt2' 00:24:48.627 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:48.627 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:48.627 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:48.885 "name": "pt1", 00:24:48.885 "aliases": [ 00:24:48.885 "c0070c21-ed20-57d9-9190-64b9c1a3cc83" 00:24:48.885 ], 00:24:48.885 "product_name": "passthru", 00:24:48.885 "block_size": 4096, 00:24:48.885 "num_blocks": 8192, 00:24:48.885 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:48.885 "md_size": 32, 00:24:48.885 "md_interleave": false, 00:24:48.885 "dif_type": 0, 00:24:48.885 "assigned_rate_limits": { 00:24:48.885 "rw_ios_per_sec": 0, 00:24:48.885 "rw_mbytes_per_sec": 0, 00:24:48.885 "r_mbytes_per_sec": 0, 00:24:48.885 "w_mbytes_per_sec": 0 00:24:48.885 }, 00:24:48.885 "claimed": true, 00:24:48.885 "claim_type": "exclusive_write", 00:24:48.885 "zoned": false, 00:24:48.885 "supported_io_types": { 00:24:48.885 "read": true, 00:24:48.885 "write": true, 00:24:48.885 "unmap": true, 00:24:48.885 "write_zeroes": true, 00:24:48.885 "flush": true, 00:24:48.885 "reset": true, 00:24:48.885 "compare": false, 00:24:48.885 "compare_and_write": false, 00:24:48.885 "abort": true, 00:24:48.885 "nvme_admin": false, 00:24:48.885 "nvme_io": false 00:24:48.885 }, 00:24:48.885 "memory_domains": [ 00:24:48.885 { 00:24:48.885 "dma_device_id": "system", 00:24:48.885 "dma_device_type": 1 00:24:48.885 }, 00:24:48.885 { 00:24:48.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.885 "dma_device_type": 2 00:24:48.885 } 00:24:48.885 ], 00:24:48.885 "driver_specific": { 00:24:48.885 "passthru": { 00:24:48.885 "name": "pt1", 00:24:48.885 "base_bdev_name": "malloc1" 00:24:48.885 } 00:24:48.885 } 00:24:48.885 }' 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:48.885 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:49.143 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:49.402 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:49.402 "name": "pt2", 00:24:49.402 "aliases": [ 00:24:49.402 "8e918d48-a51b-592d-856f-d35144310e5c" 
00:24:49.402 ], 00:24:49.402 "product_name": "passthru", 00:24:49.402 "block_size": 4096, 00:24:49.402 "num_blocks": 8192, 00:24:49.402 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:49.402 "md_size": 32, 00:24:49.402 "md_interleave": false, 00:24:49.402 "dif_type": 0, 00:24:49.402 "assigned_rate_limits": { 00:24:49.402 "rw_ios_per_sec": 0, 00:24:49.402 "rw_mbytes_per_sec": 0, 00:24:49.402 "r_mbytes_per_sec": 0, 00:24:49.402 "w_mbytes_per_sec": 0 00:24:49.402 }, 00:24:49.402 "claimed": true, 00:24:49.402 "claim_type": "exclusive_write", 00:24:49.402 "zoned": false, 00:24:49.402 "supported_io_types": { 00:24:49.402 "read": true, 00:24:49.402 "write": true, 00:24:49.402 "unmap": true, 00:24:49.402 "write_zeroes": true, 00:24:49.402 "flush": true, 00:24:49.402 "reset": true, 00:24:49.402 "compare": false, 00:24:49.402 "compare_and_write": false, 00:24:49.402 "abort": true, 00:24:49.402 "nvme_admin": false, 00:24:49.402 "nvme_io": false 00:24:49.402 }, 00:24:49.402 "memory_domains": [ 00:24:49.402 { 00:24:49.402 "dma_device_id": "system", 00:24:49.402 "dma_device_type": 1 00:24:49.402 }, 00:24:49.402 { 00:24:49.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:49.402 "dma_device_type": 2 00:24:49.402 } 00:24:49.402 ], 00:24:49.402 "driver_specific": { 00:24:49.402 "passthru": { 00:24:49.402 "name": "pt2", 00:24:49.402 "base_bdev_name": "malloc2" 00:24:49.402 } 00:24:49.402 } 00:24:49.402 }' 00:24:49.402 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:49.402 00:06:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:49.660 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:49.918 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:49.918 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:49.918 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:49.918 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:24:50.176 [2024-05-15 00:06:50.521980] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:50.176 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=351a1b2b-ee63-4215-af42-8558f886a6df 00:24:50.176 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # '[' -z 351a1b2b-ee63-4215-af42-8558f886a6df ']' 00:24:50.176 00:06:50 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:50.433 [2024-05-15 00:06:50.766378] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:50.433 [2024-05-15 00:06:50.766397] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:50.433 [2024-05-15 00:06:50.766461] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:50.433 [2024-05-15 00:06:50.766514] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:50.434 [2024-05-15 00:06:50.766525] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b34c0 name raid_bdev1, state offline 00:24:50.434 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.434 00:06:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:24:50.692 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:50.950 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:50.950 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:51.207 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:51.208 00:06:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:51.480 [2024-05-15 00:06:51.993586] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:51.480 [2024-05-15 00:06:51.994926] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:51.480 [2024-05-15 00:06:51.994983] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:51.480 [2024-05-15 00:06:51.995023] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:51.480 [2024-05-15 00:06:51.995042] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:51.480 [2024-05-15 00:06:51.995052] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1831cc0 name raid_bdev1, state configuring 00:24:51.480 request: 00:24:51.480 { 00:24:51.480 "name": "raid_bdev1", 00:24:51.480 "raid_level": "raid1", 00:24:51.480 "base_bdevs": [ 00:24:51.480 "malloc1", 00:24:51.480 "malloc2" 00:24:51.480 ], 00:24:51.480 "superblock": false, 00:24:51.480 "method": "bdev_raid_create", 00:24:51.480 "req_id": 1 00:24:51.480 } 00:24:51.480 Got JSON-RPC error response 00:24:51.480 response: 00:24:51.480 { 00:24:51.480 "code": -17, 00:24:51.480 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:51.480 } 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.480 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:24:51.750 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:24:51.750 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:24:51.750 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@465 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:52.008 [2024-05-15 00:06:52.486822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:52.008 [2024-05-15 00:06:52.486859] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.008 [2024-05-15 00:06:52.486878] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1832530 00:24:52.008 [2024-05-15 00:06:52.486891] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.008 [2024-05-15 00:06:52.488287] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.008 [2024-05-15 00:06:52.488313] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:52.008 [2024-05-15 00:06:52.488355] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:24:52.008 [2024-05-15 00:06:52.488379] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:52.008 pt1 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.008 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.266 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:52.266 "name": "raid_bdev1", 00:24:52.266 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:52.266 "strip_size_kb": 0, 00:24:52.266 "state": "configuring", 00:24:52.266 "raid_level": "raid1", 00:24:52.266 "superblock": true, 00:24:52.266 "num_base_bdevs": 2, 00:24:52.266 "num_base_bdevs_discovered": 1, 00:24:52.266 "num_base_bdevs_operational": 2, 00:24:52.266 "base_bdevs_list": [ 00:24:52.266 { 00:24:52.266 "name": "pt1", 00:24:52.266 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:52.266 "is_configured": true, 00:24:52.266 "data_offset": 256, 00:24:52.266 "data_size": 7936 00:24:52.266 }, 00:24:52.266 { 00:24:52.266 "name": null, 00:24:52.266 "uuid": 
"8e918d48-a51b-592d-856f-d35144310e5c", 00:24:52.266 "is_configured": false, 00:24:52.266 "data_offset": 256, 00:24:52.266 "data_size": 7936 00:24:52.266 } 00:24:52.266 ] 00:24:52.266 }' 00:24:52.266 00:06:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:52.266 00:06:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:52.831 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:24:52.831 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:24:52.831 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:24:52.831 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:53.090 [2024-05-15 00:06:53.585735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:53.090 [2024-05-15 00:06:53.585784] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:53.090 [2024-05-15 00:06:53.585806] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b5680 00:24:53.090 [2024-05-15 00:06:53.585819] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:53.090 [2024-05-15 00:06:53.586009] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:53.090 [2024-05-15 00:06:53.586025] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:53.090 [2024-05-15 00:06:53.586067] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:24:53.090 [2024-05-15 00:06:53.586085] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:53.090 [2024-05-15 00:06:53.586176] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b6030 00:24:53.090 [2024-05-15 00:06:53.586187] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:53.090 [2024-05-15 00:06:53.586246] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1832c10 00:24:53.090 [2024-05-15 00:06:53.586349] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b6030 00:24:53.090 [2024-05-15 00:06:53.586359] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b6030 00:24:53.090 [2024-05-15 00:06:53.586440] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:53.090 pt2 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.090 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.348 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:53.348 "name": "raid_bdev1", 00:24:53.348 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:53.348 "strip_size_kb": 0, 00:24:53.348 "state": "online", 00:24:53.348 "raid_level": "raid1", 00:24:53.348 "superblock": true, 00:24:53.348 "num_base_bdevs": 2, 00:24:53.348 "num_base_bdevs_discovered": 2, 00:24:53.348 "num_base_bdevs_operational": 2, 00:24:53.348 "base_bdevs_list": [ 00:24:53.348 { 00:24:53.348 "name": "pt1", 00:24:53.348 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:53.348 "is_configured": true, 00:24:53.348 "data_offset": 256, 00:24:53.348 "data_size": 7936 00:24:53.348 }, 00:24:53.348 { 00:24:53.348 "name": "pt2", 00:24:53.348 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:53.348 "is_configured": true, 00:24:53.348 "data_offset": 256, 00:24:53.348 "data_size": 7936 00:24:53.348 } 00:24:53.348 ] 00:24:53.348 }' 00:24:53.348 00:06:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:53.348 00:06:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:53.914 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:54.172 [2024-05-15 00:06:54.684870] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:54.172 "name": "raid_bdev1", 00:24:54.172 "aliases": [ 00:24:54.172 
"351a1b2b-ee63-4215-af42-8558f886a6df" 00:24:54.172 ], 00:24:54.172 "product_name": "Raid Volume", 00:24:54.172 "block_size": 4096, 00:24:54.172 "num_blocks": 7936, 00:24:54.172 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:54.172 "md_size": 32, 00:24:54.172 "md_interleave": false, 00:24:54.172 "dif_type": 0, 00:24:54.172 "assigned_rate_limits": { 00:24:54.172 "rw_ios_per_sec": 0, 00:24:54.172 "rw_mbytes_per_sec": 0, 00:24:54.172 "r_mbytes_per_sec": 0, 00:24:54.172 "w_mbytes_per_sec": 0 00:24:54.172 }, 00:24:54.172 "claimed": false, 00:24:54.172 "zoned": false, 00:24:54.172 "supported_io_types": { 00:24:54.172 "read": true, 00:24:54.172 "write": true, 00:24:54.172 "unmap": false, 00:24:54.172 "write_zeroes": true, 00:24:54.172 "flush": false, 00:24:54.172 "reset": true, 00:24:54.172 "compare": false, 00:24:54.172 "compare_and_write": false, 00:24:54.172 "abort": false, 00:24:54.172 "nvme_admin": false, 00:24:54.172 "nvme_io": false 00:24:54.172 }, 00:24:54.172 "memory_domains": [ 00:24:54.172 { 00:24:54.172 "dma_device_id": "system", 00:24:54.172 "dma_device_type": 1 00:24:54.172 }, 00:24:54.172 { 00:24:54.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:54.172 "dma_device_type": 2 00:24:54.172 }, 00:24:54.172 { 00:24:54.172 "dma_device_id": "system", 00:24:54.172 "dma_device_type": 1 00:24:54.172 }, 00:24:54.172 { 00:24:54.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:54.172 "dma_device_type": 2 00:24:54.172 } 00:24:54.172 ], 00:24:54.172 "driver_specific": { 00:24:54.172 "raid": { 00:24:54.172 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:54.172 "strip_size_kb": 0, 00:24:54.172 "state": "online", 00:24:54.172 "raid_level": "raid1", 00:24:54.172 "superblock": true, 00:24:54.172 "num_base_bdevs": 2, 00:24:54.172 "num_base_bdevs_discovered": 2, 00:24:54.172 "num_base_bdevs_operational": 2, 00:24:54.172 "base_bdevs_list": [ 00:24:54.172 { 00:24:54.172 "name": "pt1", 00:24:54.172 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:54.172 "is_configured": true, 00:24:54.172 "data_offset": 256, 00:24:54.172 "data_size": 7936 00:24:54.172 }, 00:24:54.172 { 00:24:54.172 "name": "pt2", 00:24:54.172 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:54.172 "is_configured": true, 00:24:54.172 "data_offset": 256, 00:24:54.172 "data_size": 7936 00:24:54.172 } 00:24:54.172 ] 00:24:54.172 } 00:24:54.172 } 00:24:54.172 }' 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:24:54.172 pt2' 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:54.172 00:06:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:54.430 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:54.430 "name": "pt1", 00:24:54.430 "aliases": [ 00:24:54.430 "c0070c21-ed20-57d9-9190-64b9c1a3cc83" 00:24:54.430 ], 00:24:54.430 "product_name": "passthru", 00:24:54.430 "block_size": 4096, 00:24:54.430 "num_blocks": 8192, 00:24:54.430 "uuid": "c0070c21-ed20-57d9-9190-64b9c1a3cc83", 00:24:54.430 
"md_size": 32, 00:24:54.430 "md_interleave": false, 00:24:54.430 "dif_type": 0, 00:24:54.430 "assigned_rate_limits": { 00:24:54.430 "rw_ios_per_sec": 0, 00:24:54.430 "rw_mbytes_per_sec": 0, 00:24:54.430 "r_mbytes_per_sec": 0, 00:24:54.430 "w_mbytes_per_sec": 0 00:24:54.430 }, 00:24:54.430 "claimed": true, 00:24:54.430 "claim_type": "exclusive_write", 00:24:54.430 "zoned": false, 00:24:54.430 "supported_io_types": { 00:24:54.430 "read": true, 00:24:54.430 "write": true, 00:24:54.430 "unmap": true, 00:24:54.430 "write_zeroes": true, 00:24:54.430 "flush": true, 00:24:54.430 "reset": true, 00:24:54.430 "compare": false, 00:24:54.430 "compare_and_write": false, 00:24:54.430 "abort": true, 00:24:54.430 "nvme_admin": false, 00:24:54.430 "nvme_io": false 00:24:54.430 }, 00:24:54.430 "memory_domains": [ 00:24:54.430 { 00:24:54.430 "dma_device_id": "system", 00:24:54.430 "dma_device_type": 1 00:24:54.430 }, 00:24:54.430 { 00:24:54.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:54.430 "dma_device_type": 2 00:24:54.430 } 00:24:54.430 ], 00:24:54.430 "driver_specific": { 00:24:54.430 "passthru": { 00:24:54.430 "name": "pt1", 00:24:54.430 "base_bdev_name": "malloc1" 00:24:54.430 } 00:24:54.430 } 00:24:54.430 }' 00:24:54.430 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:54.688 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:54.944 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:54.944 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:54.944 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:54.944 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:54.944 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:55.202 "name": "pt2", 00:24:55.202 "aliases": [ 00:24:55.202 "8e918d48-a51b-592d-856f-d35144310e5c" 00:24:55.202 ], 00:24:55.202 "product_name": "passthru", 00:24:55.202 "block_size": 4096, 00:24:55.202 "num_blocks": 8192, 00:24:55.202 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:55.202 "md_size": 32, 00:24:55.202 "md_interleave": false, 00:24:55.202 "dif_type": 0, 00:24:55.202 "assigned_rate_limits": { 00:24:55.202 "rw_ios_per_sec": 0, 
00:24:55.202 "rw_mbytes_per_sec": 0, 00:24:55.202 "r_mbytes_per_sec": 0, 00:24:55.202 "w_mbytes_per_sec": 0 00:24:55.202 }, 00:24:55.202 "claimed": true, 00:24:55.202 "claim_type": "exclusive_write", 00:24:55.202 "zoned": false, 00:24:55.202 "supported_io_types": { 00:24:55.202 "read": true, 00:24:55.202 "write": true, 00:24:55.202 "unmap": true, 00:24:55.202 "write_zeroes": true, 00:24:55.202 "flush": true, 00:24:55.202 "reset": true, 00:24:55.202 "compare": false, 00:24:55.202 "compare_and_write": false, 00:24:55.202 "abort": true, 00:24:55.202 "nvme_admin": false, 00:24:55.202 "nvme_io": false 00:24:55.202 }, 00:24:55.202 "memory_domains": [ 00:24:55.202 { 00:24:55.202 "dma_device_id": "system", 00:24:55.202 "dma_device_type": 1 00:24:55.202 }, 00:24:55.202 { 00:24:55.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:55.202 "dma_device_type": 2 00:24:55.202 } 00:24:55.202 ], 00:24:55.202 "driver_specific": { 00:24:55.202 "passthru": { 00:24:55.202 "name": "pt2", 00:24:55.202 "base_bdev_name": "malloc2" 00:24:55.202 } 00:24:55.202 } 00:24:55.202 }' 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:55.202 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:55.460 00:06:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:24:55.718 [2024-05-15 00:06:56.172814] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:55.718 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # '[' 351a1b2b-ee63-4215-af42-8558f886a6df '!=' 351a1b2b-ee63-4215-af42-8558f886a6df ']' 00:24:55.718 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:24:55.718 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:24:55.718 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:24:55.718 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:24:55.975 [2024-05-15 00:06:56.413237] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.975 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.232 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:56.232 "name": "raid_bdev1", 00:24:56.232 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:56.232 "strip_size_kb": 0, 00:24:56.232 "state": "online", 00:24:56.232 "raid_level": "raid1", 00:24:56.232 "superblock": true, 00:24:56.232 "num_base_bdevs": 2, 00:24:56.232 "num_base_bdevs_discovered": 1, 00:24:56.232 "num_base_bdevs_operational": 1, 00:24:56.232 "base_bdevs_list": [ 00:24:56.232 { 00:24:56.232 "name": null, 00:24:56.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.232 "is_configured": false, 00:24:56.232 "data_offset": 256, 00:24:56.232 "data_size": 7936 00:24:56.232 }, 00:24:56.232 { 00:24:56.232 "name": "pt2", 00:24:56.232 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:56.232 "is_configured": true, 00:24:56.232 "data_offset": 256, 00:24:56.232 "data_size": 7936 00:24:56.232 } 00:24:56.232 ] 00:24:56.232 }' 00:24:56.232 00:06:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:56.232 00:06:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:56.854 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:57.111 [2024-05-15 00:06:57.496092] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:57.111 [2024-05-15 00:06:57.496118] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:57.111 [2024-05-15 00:06:57.496179] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:57.111 [2024-05-15 00:06:57.496223] bdev_raid.c: 
425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:57.111 [2024-05-15 00:06:57.496235] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b6030 name raid_bdev1, state offline 00:24:57.111 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.111 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:24:57.367 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:24:57.367 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:24:57.367 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:24:57.368 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:57.368 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # i=1 00:24:57.625 00:06:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:57.882 [2024-05-15 00:06:58.217977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:57.882 [2024-05-15 00:06:58.218026] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:57.882 [2024-05-15 00:06:58.218045] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18327a0 00:24:57.882 [2024-05-15 00:06:58.218058] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:57.882 [2024-05-15 00:06:58.219571] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:57.882 [2024-05-15 00:06:58.219599] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:57.882 [2024-05-15 00:06:58.219650] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:24:57.882 [2024-05-15 00:06:58.219677] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:57.882 [2024-05-15 00:06:58.219759] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b6470 00:24:57.882 [2024-05-15 00:06:58.219775] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:57.882 [2024-05-15 00:06:58.219832] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b2df0 00:24:57.882 [2024-05-15 00:06:58.219935] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b6470 00:24:57.882 [2024-05-15 00:06:58.219945] 
bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b6470 00:24:57.882 [2024-05-15 00:06:58.220013] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:57.882 pt2 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.882 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.140 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:58.140 "name": "raid_bdev1", 00:24:58.140 "uuid": "351a1b2b-ee63-4215-af42-8558f886a6df", 00:24:58.140 "strip_size_kb": 0, 00:24:58.140 "state": "online", 00:24:58.140 "raid_level": "raid1", 00:24:58.140 "superblock": true, 00:24:58.140 "num_base_bdevs": 2, 00:24:58.140 "num_base_bdevs_discovered": 1, 00:24:58.140 "num_base_bdevs_operational": 1, 00:24:58.140 "base_bdevs_list": [ 00:24:58.140 { 00:24:58.140 "name": null, 00:24:58.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.140 "is_configured": false, 00:24:58.140 "data_offset": 256, 00:24:58.140 "data_size": 7936 00:24:58.140 }, 00:24:58.140 { 00:24:58.140 "name": "pt2", 00:24:58.140 "uuid": "8e918d48-a51b-592d-856f-d35144310e5c", 00:24:58.140 "is_configured": true, 00:24:58.140 "data_offset": 256, 00:24:58.140 "data_size": 7936 00:24:58.140 } 00:24:58.140 ] 00:24:58.140 }' 00:24:58.140 00:06:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:58.140 00:06:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:58.705 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:24:58.705 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:58.705 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:24:58.705 [2024-05-15 00:06:59.293022] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: 
raid_bdev_dump_config_json 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # '[' 351a1b2b-ee63-4215-af42-8558f886a6df '!=' 351a1b2b-ee63-4215-af42-8558f886a6df ']' 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@568 -- # killprocess 514866 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@946 -- # '[' -z 514866 ']' 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # kill -0 514866 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # uname 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 514866 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 514866' 00:24:58.962 killing process with pid 514866 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@965 -- # kill 514866 00:24:58.962 [2024-05-15 00:06:59.369822] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:58.962 [2024-05-15 00:06:59.369888] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:58.962 [2024-05-15 00:06:59.369933] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:58.962 [2024-05-15 00:06:59.369947] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b6470 name raid_bdev1, state offline 00:24:58.962 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@970 -- # wait 514866 00:24:58.962 [2024-05-15 00:06:59.393432] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:59.220 00:06:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # return 0 00:24:59.220 00:24:59.220 real 0m13.903s 00:24:59.220 user 0m25.038s 00:24:59.220 sys 0m2.582s 00:24:59.220 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:59.220 00:06:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:59.220 ************************************ 00:24:59.220 END TEST raid_superblock_test_md_separate 00:24:59.220 ************************************ 00:24:59.220 00:06:59 bdev_raid -- bdev/bdev_raid.sh@853 -- # '[' true = true ']' 00:24:59.220 00:06:59 bdev_raid -- bdev/bdev_raid.sh@854 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:24:59.220 00:06:59 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:24:59.220 00:06:59 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:59.220 00:06:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:59.220 ************************************ 00:24:59.220 START TEST raid_rebuild_test_sb_md_separate 00:24:59.220 ************************************ 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1121 -- # 
raid_rebuild_test raid1 2 true false true 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local verify=true 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # local strip_size 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@582 -- # local create_arg 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local data_offset 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # raid_pid=516992 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@603 -- # waitforlisten 516992 /var/tmp/spdk-raid.sock 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 516992 ']' 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:59.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:59.220 00:06:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:59.220 [2024-05-15 00:06:59.799967] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:24:59.220 [2024-05-15 00:06:59.800033] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid516992 ] 00:24:59.220 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:59.220 Zero copy mechanism will not be used. 00:24:59.477 [2024-05-15 00:06:59.923345] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:59.477 [2024-05-15 00:07:00.041567] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:59.734 [2024-05-15 00:07:00.117050] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:59.734 [2024-05-15 00:07:00.117087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:00.298 00:07:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:00.298 00:07:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:25:00.298 00:07:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:00.298 00:07:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:25:00.556 BaseBdev1_malloc 00:25:00.556 00:07:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:00.813 [2024-05-15 00:07:01.196100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:00.813 [2024-05-15 00:07:01.196147] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.814 [2024-05-15 00:07:01.196171] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22af640 00:25:00.814 [2024-05-15 00:07:01.196189] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.814 [2024-05-15 00:07:01.197783] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.814 [2024-05-15 00:07:01.197811] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:00.814 BaseBdev1 00:25:00.814 00:07:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:00.814 00:07:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:25:01.071 BaseBdev2_malloc 00:25:01.071 00:07:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:01.328 [2024-05-15 00:07:01.674866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:01.328 [2024-05-15 00:07:01.674913] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:01.328 [2024-05-15 00:07:01.674935] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c2850 00:25:01.328 [2024-05-15 00:07:01.674948] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:01.328 [2024-05-15 00:07:01.676383] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:01.328 [2024-05-15 00:07:01.676417] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:01.328 BaseBdev2 00:25:01.328 00:07:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:25:01.585 spare_malloc 00:25:01.585 00:07:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:01.585 spare_delay 00:25:01.585 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:01.841 [2024-05-15 00:07:02.395410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:01.841 [2024-05-15 00:07:02.395451] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:01.841 [2024-05-15 00:07:02.395471] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c5610 00:25:01.841 [2024-05-15 00:07:02.395484] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:01.841 [2024-05-15 00:07:02.396762] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:01.841 [2024-05-15 00:07:02.396788] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:01.841 spare 00:25:01.841 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:02.099 [2024-05-15 00:07:02.648108] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:02.099 [2024-05-15 00:07:02.649493] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:02.099 [2024-05-15 00:07:02.649664] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x23c7730 00:25:02.099 [2024-05-15 00:07:02.649678] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:02.099 [2024-05-15 00:07:02.649750] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x23c92c0 00:25:02.099 [2024-05-15 00:07:02.649864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23c7730 00:25:02.099 [2024-05-15 00:07:02.649874] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23c7730 00:25:02.099 [2024-05-15 00:07:02.649948] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.099 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.356 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:02.356 "name": "raid_bdev1", 00:25:02.356 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:02.356 "strip_size_kb": 0, 00:25:02.356 "state": "online", 00:25:02.356 "raid_level": "raid1", 00:25:02.356 "superblock": true, 00:25:02.356 "num_base_bdevs": 2, 00:25:02.356 "num_base_bdevs_discovered": 2, 00:25:02.356 "num_base_bdevs_operational": 2, 00:25:02.356 "base_bdevs_list": [ 00:25:02.356 { 00:25:02.356 "name": "BaseBdev1", 00:25:02.356 "uuid": "1b820ead-3c4c-5f90-b95b-6479b0f4769b", 00:25:02.356 "is_configured": true, 00:25:02.356 "data_offset": 256, 00:25:02.356 "data_size": 7936 00:25:02.356 }, 00:25:02.356 { 00:25:02.356 "name": "BaseBdev2", 00:25:02.356 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:02.356 "is_configured": true, 00:25:02.356 "data_offset": 256, 00:25:02.356 "data_size": 7936 00:25:02.356 } 00:25:02.356 ] 00:25:02.356 }' 00:25:02.356 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:02.356 00:07:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:03.288 00:07:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:03.288 00:07:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:25:03.288 [2024-05-15 00:07:03.735188] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:03.288 00:07:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:25:03.288 00:07:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.288 00:07:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:03.547 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:03.804 [2024-05-15 00:07:04.228321] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c9120 00:25:03.804 /dev/nbd0 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:25:03.804 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@880 -- # (( i <= 20 )) 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:03.805 1+0 records in 00:25:03.805 1+0 records out 00:25:03.805 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240412 s, 17.0 MB/s 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:25:03.805 00:07:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:25:04.737 7936+0 records in 00:25:04.737 7936+0 records out 00:25:04.737 32505856 bytes (33 MB, 31 MiB) copied, 0.762371 s, 42.6 MB/s 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.737 [2024-05-15 00:07:05.322039] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.737 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd0 /proc/partitions 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:04.994 [2024-05-15 00:07:05.558714] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:04.994 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:05.252 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.252 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.252 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:05.252 "name": "raid_bdev1", 00:25:05.252 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:05.252 "strip_size_kb": 0, 00:25:05.252 "state": "online", 00:25:05.252 "raid_level": "raid1", 00:25:05.252 "superblock": true, 00:25:05.252 "num_base_bdevs": 2, 00:25:05.252 "num_base_bdevs_discovered": 1, 00:25:05.252 "num_base_bdevs_operational": 1, 00:25:05.252 "base_bdevs_list": [ 00:25:05.252 { 00:25:05.252 "name": null, 00:25:05.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.252 "is_configured": false, 00:25:05.252 "data_offset": 256, 00:25:05.252 "data_size": 7936 00:25:05.252 }, 00:25:05.252 { 00:25:05.252 "name": "BaseBdev2", 00:25:05.252 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:05.252 "is_configured": true, 00:25:05.252 "data_offset": 256, 00:25:05.252 "data_size": 7936 00:25:05.252 } 00:25:05.252 ] 00:25:05.252 }' 00:25:05.252 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:05.252 00:07:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:06.186 00:07:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
00:25:06.186 [2024-05-15 00:07:06.649618] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:06.186 [2024-05-15 00:07:06.651922] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c9e00 00:25:06.186 [2024-05-15 00:07:06.654182] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:06.186 00:07:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # sleep 1 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.120 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.378 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:07.378 "name": "raid_bdev1", 00:25:07.378 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:07.378 "strip_size_kb": 0, 00:25:07.378 "state": "online", 00:25:07.378 "raid_level": "raid1", 00:25:07.378 "superblock": true, 00:25:07.378 "num_base_bdevs": 2, 00:25:07.378 "num_base_bdevs_discovered": 2, 00:25:07.378 "num_base_bdevs_operational": 2, 00:25:07.378 "process": { 00:25:07.378 "type": "rebuild", 00:25:07.378 "target": "spare", 00:25:07.378 "progress": { 00:25:07.378 "blocks": 3072, 00:25:07.378 "percent": 38 00:25:07.378 } 00:25:07.378 }, 00:25:07.378 "base_bdevs_list": [ 00:25:07.378 { 00:25:07.378 "name": "spare", 00:25:07.378 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:07.378 "is_configured": true, 00:25:07.378 "data_offset": 256, 00:25:07.378 "data_size": 7936 00:25:07.378 }, 00:25:07.378 { 00:25:07.378 "name": "BaseBdev2", 00:25:07.378 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:07.378 "is_configured": true, 00:25:07.378 "data_offset": 256, 00:25:07.378 "data_size": 7936 00:25:07.378 } 00:25:07.378 ] 00:25:07.378 }' 00:25:07.378 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:07.637 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:07.637 00:07:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:07.637 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.637 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:07.895 [2024-05-15 00:07:08.243115] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.895 [2024-05-15 00:07:08.267175] bdev_raid.c:2467:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:07.895 [2024-05-15 00:07:08.267217] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:07.895 "name": "raid_bdev1", 00:25:07.895 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:07.895 "strip_size_kb": 0, 00:25:07.895 "state": "online", 00:25:07.895 "raid_level": "raid1", 00:25:07.895 "superblock": true, 00:25:07.895 "num_base_bdevs": 2, 00:25:07.895 "num_base_bdevs_discovered": 1, 00:25:07.895 "num_base_bdevs_operational": 1, 00:25:07.895 "base_bdevs_list": [ 00:25:07.895 { 00:25:07.895 "name": null, 00:25:07.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.895 "is_configured": false, 00:25:07.895 "data_offset": 256, 00:25:07.895 "data_size": 7936 00:25:07.895 }, 00:25:07.895 { 00:25:07.895 "name": "BaseBdev2", 00:25:07.895 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:07.895 "is_configured": true, 00:25:07.895 "data_offset": 256, 00:25:07.895 "data_size": 7936 00:25:07.895 } 00:25:07.895 ] 00:25:07.895 }' 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:07.895 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local 
raid_bdev_info 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.493 00:07:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:08.750 "name": "raid_bdev1", 00:25:08.750 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:08.750 "strip_size_kb": 0, 00:25:08.750 "state": "online", 00:25:08.750 "raid_level": "raid1", 00:25:08.750 "superblock": true, 00:25:08.750 "num_base_bdevs": 2, 00:25:08.750 "num_base_bdevs_discovered": 1, 00:25:08.750 "num_base_bdevs_operational": 1, 00:25:08.750 "base_bdevs_list": [ 00:25:08.750 { 00:25:08.750 "name": null, 00:25:08.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.750 "is_configured": false, 00:25:08.750 "data_offset": 256, 00:25:08.750 "data_size": 7936 00:25:08.750 }, 00:25:08.750 { 00:25:08.750 "name": "BaseBdev2", 00:25:08.750 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:08.750 "is_configured": true, 00:25:08.750 "data_offset": 256, 00:25:08.750 "data_size": 7936 00:25:08.750 } 00:25:08.750 ] 00:25:08.750 }' 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:08.750 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:09.008 [2024-05-15 00:07:09.534635] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:09.008 [2024-05-15 00:07:09.537198] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23bd310 00:25:09.008 [2024-05-15 00:07:09.538695] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:09.008 00:07:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # sleep 1 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.381 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.381 00:07:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:10.381 "name": "raid_bdev1", 00:25:10.381 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:10.381 "strip_size_kb": 0, 00:25:10.381 "state": "online", 00:25:10.381 "raid_level": "raid1", 00:25:10.381 "superblock": true, 00:25:10.381 "num_base_bdevs": 2, 00:25:10.381 "num_base_bdevs_discovered": 2, 00:25:10.381 "num_base_bdevs_operational": 2, 00:25:10.381 "process": { 00:25:10.381 "type": "rebuild", 00:25:10.381 "target": "spare", 00:25:10.381 "progress": { 00:25:10.381 "blocks": 3072, 00:25:10.381 "percent": 38 00:25:10.381 } 00:25:10.381 }, 00:25:10.381 "base_bdevs_list": [ 00:25:10.381 { 00:25:10.381 "name": "spare", 00:25:10.381 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:10.381 "is_configured": true, 00:25:10.381 "data_offset": 256, 00:25:10.381 "data_size": 7936 00:25:10.381 }, 00:25:10.381 { 00:25:10.381 "name": "BaseBdev2", 00:25:10.381 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:10.382 "is_configured": true, 00:25:10.382 "data_offset": 256, 00:25:10.382 "data_size": 7936 00:25:10.382 } 00:25:10.382 ] 00:25:10.382 }' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:25:10.382 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@711 -- # local timeout=921 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.382 00:07:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:25:10.640 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:10.640 "name": "raid_bdev1", 00:25:10.640 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:10.640 "strip_size_kb": 0, 00:25:10.640 "state": "online", 00:25:10.640 "raid_level": "raid1", 00:25:10.640 "superblock": true, 00:25:10.640 "num_base_bdevs": 2, 00:25:10.640 "num_base_bdevs_discovered": 2, 00:25:10.640 "num_base_bdevs_operational": 2, 00:25:10.640 "process": { 00:25:10.640 "type": "rebuild", 00:25:10.640 "target": "spare", 00:25:10.640 "progress": { 00:25:10.640 "blocks": 3840, 00:25:10.640 "percent": 48 00:25:10.640 } 00:25:10.640 }, 00:25:10.640 "base_bdevs_list": [ 00:25:10.640 { 00:25:10.640 "name": "spare", 00:25:10.640 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:10.640 "is_configured": true, 00:25:10.640 "data_offset": 256, 00:25:10.640 "data_size": 7936 00:25:10.640 }, 00:25:10.640 { 00:25:10.640 "name": "BaseBdev2", 00:25:10.640 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:10.640 "is_configured": true, 00:25:10.640 "data_offset": 256, 00:25:10.640 "data_size": 7936 00:25:10.640 } 00:25:10.640 ] 00:25:10.640 }' 00:25:10.640 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:10.640 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.640 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:10.897 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.897 00:07:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@716 -- # sleep 1 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.832 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:12.090 "name": "raid_bdev1", 00:25:12.090 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:12.090 "strip_size_kb": 0, 00:25:12.090 "state": "online", 00:25:12.090 "raid_level": "raid1", 00:25:12.090 "superblock": true, 00:25:12.090 "num_base_bdevs": 2, 00:25:12.090 "num_base_bdevs_discovered": 2, 00:25:12.090 "num_base_bdevs_operational": 2, 00:25:12.090 "process": { 00:25:12.090 "type": "rebuild", 00:25:12.090 "target": "spare", 00:25:12.090 "progress": { 00:25:12.090 "blocks": 7424, 00:25:12.090 "percent": 93 
00:25:12.090 } 00:25:12.090 }, 00:25:12.090 "base_bdevs_list": [ 00:25:12.090 { 00:25:12.090 "name": "spare", 00:25:12.090 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:12.090 "is_configured": true, 00:25:12.090 "data_offset": 256, 00:25:12.090 "data_size": 7936 00:25:12.090 }, 00:25:12.090 { 00:25:12.090 "name": "BaseBdev2", 00:25:12.090 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:12.090 "is_configured": true, 00:25:12.090 "data_offset": 256, 00:25:12.090 "data_size": 7936 00:25:12.090 } 00:25:12.090 ] 00:25:12.090 }' 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.090 00:07:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@716 -- # sleep 1 00:25:12.090 [2024-05-15 00:07:12.663183] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:12.090 [2024-05-15 00:07:12.663238] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:12.090 [2024-05-15 00:07:12.663319] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:13.031 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:13.031 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.031 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:13.031 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:13.031 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:13.032 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:13.032 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.032 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.297 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:13.297 "name": "raid_bdev1", 00:25:13.297 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:13.297 "strip_size_kb": 0, 00:25:13.297 "state": "online", 00:25:13.297 "raid_level": "raid1", 00:25:13.297 "superblock": true, 00:25:13.297 "num_base_bdevs": 2, 00:25:13.297 "num_base_bdevs_discovered": 2, 00:25:13.297 "num_base_bdevs_operational": 2, 00:25:13.297 "base_bdevs_list": [ 00:25:13.297 { 00:25:13.297 "name": "spare", 00:25:13.297 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:13.297 "is_configured": true, 00:25:13.297 "data_offset": 256, 00:25:13.297 "data_size": 7936 00:25:13.297 }, 00:25:13.297 { 00:25:13.297 "name": "BaseBdev2", 00:25:13.297 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:13.297 "is_configured": true, 00:25:13.297 "data_offset": 256, 00:25:13.297 "data_size": 
7936 00:25:13.297 } 00:25:13.297 ] 00:25:13.297 }' 00:25:13.297 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:13.297 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:13.297 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # break 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.555 00:07:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.555 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:13.555 "name": "raid_bdev1", 00:25:13.555 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:13.555 "strip_size_kb": 0, 00:25:13.555 "state": "online", 00:25:13.555 "raid_level": "raid1", 00:25:13.555 "superblock": true, 00:25:13.555 "num_base_bdevs": 2, 00:25:13.555 "num_base_bdevs_discovered": 2, 00:25:13.555 "num_base_bdevs_operational": 2, 00:25:13.555 "base_bdevs_list": [ 00:25:13.555 { 00:25:13.555 "name": "spare", 00:25:13.555 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:13.555 "is_configured": true, 00:25:13.555 "data_offset": 256, 00:25:13.555 "data_size": 7936 00:25:13.555 }, 00:25:13.555 { 00:25:13.555 "name": "BaseBdev2", 00:25:13.555 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:13.555 "is_configured": true, 00:25:13.555 "data_offset": 256, 00:25:13.555 "data_size": 7936 00:25:13.555 } 00:25:13.555 ] 00:25:13.555 }' 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.814 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.072 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:14.072 "name": "raid_bdev1", 00:25:14.072 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:14.072 "strip_size_kb": 0, 00:25:14.072 "state": "online", 00:25:14.072 "raid_level": "raid1", 00:25:14.072 "superblock": true, 00:25:14.072 "num_base_bdevs": 2, 00:25:14.072 "num_base_bdevs_discovered": 2, 00:25:14.072 "num_base_bdevs_operational": 2, 00:25:14.072 "base_bdevs_list": [ 00:25:14.072 { 00:25:14.072 "name": "spare", 00:25:14.072 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:14.072 "is_configured": true, 00:25:14.072 "data_offset": 256, 00:25:14.072 "data_size": 7936 00:25:14.072 }, 00:25:14.072 { 00:25:14.072 "name": "BaseBdev2", 00:25:14.072 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:14.072 "is_configured": true, 00:25:14.072 "data_offset": 256, 00:25:14.072 "data_size": 7936 00:25:14.072 } 00:25:14.072 ] 00:25:14.072 }' 00:25:14.072 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:14.072 00:07:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:14.637 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:14.895 [2024-05-15 00:07:15.281624] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:14.895 [2024-05-15 00:07:15.281652] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:14.895 [2024-05-15 00:07:15.281711] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:14.895 [2024-05-15 00:07:15.281766] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:14.895 [2024-05-15 00:07:15.281778] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c7730 name raid_bdev1, state offline 00:25:14.895 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.895 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # jq length 00:25:15.152 00:07:15 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:25:15.152 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.153 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.153 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:15.410 /dev/nbd0 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.410 1+0 records in 00:25:15.410 1+0 records out 00:25:15.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260164 s, 15.7 MB/s 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.410 00:07:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:15.668 /dev/nbd1 00:25:15.668 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:15.668 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.669 1+0 records in 00:25:15.669 1+0 records out 00:25:15.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313101 s, 13.1 MB/s 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:15.669 00:07:16 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.669 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:15.926 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:15.927 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:15.927 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.927 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:25:16.184 00:07:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:16.442 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:16.700 [2024-05-15 00:07:17.222705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:16.700 [2024-05-15 00:07:17.222751] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.700 [2024-05-15 00:07:17.222771] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x222efd0 00:25:16.700 [2024-05-15 00:07:17.222784] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.700 [2024-05-15 00:07:17.224206] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.700 [2024-05-15 00:07:17.224232] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:16.700 [2024-05-15 00:07:17.224276] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:16.700 [2024-05-15 00:07:17.224302] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:16.700 BaseBdev1 00:25:16.700 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:25:16.700 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:25:16.700 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:25:16.958 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:17.216 [2024-05-15 00:07:17.711990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:17.216 [2024-05-15 00:07:17.712026] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.216 [2024-05-15 00:07:17.712046] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c2a80 00:25:17.216 [2024-05-15 00:07:17.712058] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.216 [2024-05-15 00:07:17.712228] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.216 [2024-05-15 00:07:17.712244] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:17.216 [2024-05-15 00:07:17.712283] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:25:17.216 [2024-05-15 00:07:17.712294] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:25:17.216 [2024-05-15 00:07:17.712304] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:17.216 [2024-05-15 00:07:17.712320] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c8a50 name raid_bdev1, state configuring 00:25:17.216 [2024-05-15 00:07:17.712350] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:17.216 BaseBdev2 00:25:17.216 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete spare 00:25:17.474 00:07:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:17.732 [2024-05-15 00:07:18.205295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:17.732 [2024-05-15 00:07:18.205332] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.732 [2024-05-15 00:07:18.205354] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b7be0 00:25:17.732 [2024-05-15 00:07:18.205366] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.732 [2024-05-15 00:07:18.205555] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.732 [2024-05-15 00:07:18.205571] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:17.732 [2024-05-15 00:07:18.205622] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:25:17.732 [2024-05-15 00:07:18.205638] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:17.732 spare 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.732 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.732 [2024-05-15 00:07:18.305959] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x23b7e50 00:25:17.732 [2024-05-15 00:07:18.305974] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:17.732 [2024-05-15 00:07:18.306043] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222f260 00:25:17.732 [2024-05-15 00:07:18.306160] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23b7e50 00:25:17.732 [2024-05-15 00:07:18.306170] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23b7e50 00:25:17.732 [2024-05-15 00:07:18.306243] bdev_raid.c: 315:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:25:17.990 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:17.990 "name": "raid_bdev1", 00:25:17.990 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:17.990 "strip_size_kb": 0, 00:25:17.990 "state": "online", 00:25:17.990 "raid_level": "raid1", 00:25:17.990 "superblock": true, 00:25:17.990 "num_base_bdevs": 2, 00:25:17.990 "num_base_bdevs_discovered": 2, 00:25:17.990 "num_base_bdevs_operational": 2, 00:25:17.990 "base_bdevs_list": [ 00:25:17.990 { 00:25:17.990 "name": "spare", 00:25:17.990 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:17.990 "is_configured": true, 00:25:17.990 "data_offset": 256, 00:25:17.990 "data_size": 7936 00:25:17.990 }, 00:25:17.990 { 00:25:17.990 "name": "BaseBdev2", 00:25:17.990 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:17.990 "is_configured": true, 00:25:17.990 "data_offset": 256, 00:25:17.990 "data_size": 7936 00:25:17.990 } 00:25:17.990 ] 00:25:17.990 }' 00:25:17.990 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:17.990 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.555 00:07:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.813 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:18.813 "name": "raid_bdev1", 00:25:18.813 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:18.813 "strip_size_kb": 0, 00:25:18.813 "state": "online", 00:25:18.813 "raid_level": "raid1", 00:25:18.813 "superblock": true, 00:25:18.813 "num_base_bdevs": 2, 00:25:18.814 "num_base_bdevs_discovered": 2, 00:25:18.814 "num_base_bdevs_operational": 2, 00:25:18.814 "base_bdevs_list": [ 00:25:18.814 { 00:25:18.814 "name": "spare", 00:25:18.814 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:18.814 "is_configured": true, 00:25:18.814 "data_offset": 256, 00:25:18.814 "data_size": 7936 00:25:18.814 }, 00:25:18.814 { 00:25:18.814 "name": "BaseBdev2", 00:25:18.814 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:18.814 "is_configured": true, 00:25:18.814 "data_offset": 256, 00:25:18.814 "data_size": 7936 00:25:18.814 } 00:25:18.814 ] 00:25:18.814 }' 00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 
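The verification steps traced above reduce to one pattern: fetch the raid bdev's JSON over the RPC socket and compare the rebuild-process fields with jq. A minimal stand-alone sketch of that pattern follows, assuming it is run from an SPDK checkout; the socket path, bdev name, RPC call and jq filters are taken from the trace, while the helper's name and argument handling are illustrative rather than the exact bdev_raid.sh implementation (the trace invokes rpc.py by its absolute workspace path).

check_raid_process() {
    # rpc_sock and raid_name as used in the trace, e.g. /var/tmp/spdk-raid.sock and raid_bdev1
    local rpc_sock=$1 raid_name=$2 want_type=$3 want_target=$4
    local info
    info=$(scripts/rpc.py -s "$rpc_sock" bdev_raid_get_bdevs all |
        jq -r ".[] | select(.name == \"$raid_name\")")
    # a missing process object counts as "none", matching the '// "none"' filters in the trace
    [[ $(echo "$info" | jq -r '.process.type // "none"') == "$want_type" ]] &&
        [[ $(echo "$info" | jq -r '.process.target // "none"') == "$want_target" ]]
}
# e.g.: check_raid_process /var/tmp/spdk-raid.sock raid_bdev1 none none
#       check_raid_process /var/tmp/spdk-raid.sock raid_bdev1 rebuild spare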
00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.814 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:19.072 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:25:19.072 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:19.329 [2024-05-15 00:07:19.813687] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:19.329 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:19.329 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:19.329 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.330 00:07:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.588 00:07:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:19.588 "name": "raid_bdev1", 00:25:19.588 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:19.588 "strip_size_kb": 0, 00:25:19.588 "state": "online", 00:25:19.588 "raid_level": "raid1", 00:25:19.588 "superblock": true, 00:25:19.588 "num_base_bdevs": 2, 00:25:19.588 "num_base_bdevs_discovered": 1, 00:25:19.588 "num_base_bdevs_operational": 1, 00:25:19.588 "base_bdevs_list": [ 00:25:19.588 { 00:25:19.588 "name": null, 00:25:19.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.588 "is_configured": false, 00:25:19.588 "data_offset": 256, 00:25:19.588 "data_size": 7936 00:25:19.588 }, 00:25:19.588 { 00:25:19.588 "name": "BaseBdev2", 00:25:19.588 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:19.588 "is_configured": true, 00:25:19.588 "data_offset": 256, 00:25:19.588 "data_size": 7936 00:25:19.588 } 00:25:19.588 ] 00:25:19.588 }' 00:25:19.588 00:07:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:25:19.588 00:07:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:20.152 00:07:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:20.409 [2024-05-15 00:07:20.876502] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:20.409 [2024-05-15 00:07:20.876658] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:20.409 [2024-05-15 00:07:20.876675] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:20.409 [2024-05-15 00:07:20.876702] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:20.409 [2024-05-15 00:07:20.878854] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c51c0 00:25:20.409 [2024-05-15 00:07:20.880182] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:20.409 00:07:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # sleep 1 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.341 00:07:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.599 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:21.599 "name": "raid_bdev1", 00:25:21.599 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:21.599 "strip_size_kb": 0, 00:25:21.599 "state": "online", 00:25:21.599 "raid_level": "raid1", 00:25:21.599 "superblock": true, 00:25:21.599 "num_base_bdevs": 2, 00:25:21.599 "num_base_bdevs_discovered": 2, 00:25:21.599 "num_base_bdevs_operational": 2, 00:25:21.599 "process": { 00:25:21.599 "type": "rebuild", 00:25:21.599 "target": "spare", 00:25:21.599 "progress": { 00:25:21.599 "blocks": 3072, 00:25:21.599 "percent": 38 00:25:21.599 } 00:25:21.599 }, 00:25:21.599 "base_bdevs_list": [ 00:25:21.599 { 00:25:21.599 "name": "spare", 00:25:21.599 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:21.599 "is_configured": true, 00:25:21.599 "data_offset": 256, 00:25:21.599 "data_size": 7936 00:25:21.599 }, 00:25:21.599 { 00:25:21.599 "name": "BaseBdev2", 00:25:21.600 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:21.600 "is_configured": true, 00:25:21.600 "data_offset": 256, 00:25:21.600 "data_size": 7936 00:25:21.600 } 00:25:21.600 ] 00:25:21.600 }' 00:25:21.600 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // 
"none"' 00:25:21.600 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:21.858 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:21.858 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:21.858 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:21.858 [2024-05-15 00:07:22.445753] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:22.116 [2024-05-15 00:07:22.492884] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:22.116 [2024-05-15 00:07:22.492929] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.116 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.373 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:22.373 "name": "raid_bdev1", 00:25:22.373 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:22.373 "strip_size_kb": 0, 00:25:22.373 "state": "online", 00:25:22.373 "raid_level": "raid1", 00:25:22.373 "superblock": true, 00:25:22.373 "num_base_bdevs": 2, 00:25:22.373 "num_base_bdevs_discovered": 1, 00:25:22.373 "num_base_bdevs_operational": 1, 00:25:22.373 "base_bdevs_list": [ 00:25:22.373 { 00:25:22.373 "name": null, 00:25:22.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.373 "is_configured": false, 00:25:22.373 "data_offset": 256, 00:25:22.373 "data_size": 7936 00:25:22.373 }, 00:25:22.373 { 00:25:22.373 "name": "BaseBdev2", 00:25:22.373 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:22.373 "is_configured": true, 00:25:22.373 "data_offset": 256, 00:25:22.373 "data_size": 7936 00:25:22.373 } 00:25:22.373 ] 00:25:22.373 }' 00:25:22.373 00:07:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:22.373 00:07:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:22.938 00:07:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:22.938 [2024-05-15 00:07:23.527559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:22.938 [2024-05-15 00:07:23.527612] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.938 [2024-05-15 00:07:23.527634] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x222d930 00:25:22.938 [2024-05-15 00:07:23.527646] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.938 [2024-05-15 00:07:23.527861] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.938 [2024-05-15 00:07:23.527877] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:22.938 [2024-05-15 00:07:23.527936] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:25:22.938 [2024-05-15 00:07:23.527948] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:22.938 [2024-05-15 00:07:23.527964] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:22.938 [2024-05-15 00:07:23.527982] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:23.195 [2024-05-15 00:07:23.530170] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ca490 00:25:23.195 [2024-05-15 00:07:23.531500] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:23.195 spare 00:25:23.195 00:07:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # sleep 1 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.129 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:24.387 "name": "raid_bdev1", 00:25:24.387 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:24.387 "strip_size_kb": 0, 00:25:24.387 "state": "online", 00:25:24.387 "raid_level": "raid1", 00:25:24.387 "superblock": true, 00:25:24.387 "num_base_bdevs": 2, 00:25:24.387 "num_base_bdevs_discovered": 2, 00:25:24.387 "num_base_bdevs_operational": 2, 
00:25:24.387 "process": { 00:25:24.387 "type": "rebuild", 00:25:24.387 "target": "spare", 00:25:24.387 "progress": { 00:25:24.387 "blocks": 3072, 00:25:24.387 "percent": 38 00:25:24.387 } 00:25:24.387 }, 00:25:24.387 "base_bdevs_list": [ 00:25:24.387 { 00:25:24.387 "name": "spare", 00:25:24.387 "uuid": "c3d848bb-7ecc-5d07-b5e0-aadd2f90b7a5", 00:25:24.387 "is_configured": true, 00:25:24.387 "data_offset": 256, 00:25:24.387 "data_size": 7936 00:25:24.387 }, 00:25:24.387 { 00:25:24.387 "name": "BaseBdev2", 00:25:24.387 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:24.387 "is_configured": true, 00:25:24.387 "data_offset": 256, 00:25:24.387 "data_size": 7936 00:25:24.387 } 00:25:24.387 ] 00:25:24.387 }' 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.387 00:07:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:24.645 [2024-05-15 00:07:25.108370] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:24.645 [2024-05-15 00:07:25.144209] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:24.645 [2024-05-15 00:07:25.144259] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.645 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.904 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:24.904 "name": "raid_bdev1", 00:25:24.904 "uuid": 
"02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:24.904 "strip_size_kb": 0, 00:25:24.904 "state": "online", 00:25:24.904 "raid_level": "raid1", 00:25:24.904 "superblock": true, 00:25:24.904 "num_base_bdevs": 2, 00:25:24.904 "num_base_bdevs_discovered": 1, 00:25:24.904 "num_base_bdevs_operational": 1, 00:25:24.904 "base_bdevs_list": [ 00:25:24.904 { 00:25:24.904 "name": null, 00:25:24.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.904 "is_configured": false, 00:25:24.904 "data_offset": 256, 00:25:24.904 "data_size": 7936 00:25:24.904 }, 00:25:24.904 { 00:25:24.904 "name": "BaseBdev2", 00:25:24.904 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:24.904 "is_configured": true, 00:25:24.904 "data_offset": 256, 00:25:24.904 "data_size": 7936 00:25:24.904 } 00:25:24.904 ] 00:25:24.904 }' 00:25:24.904 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:24.904 00:07:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.509 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:25.768 "name": "raid_bdev1", 00:25:25.768 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:25.768 "strip_size_kb": 0, 00:25:25.768 "state": "online", 00:25:25.768 "raid_level": "raid1", 00:25:25.768 "superblock": true, 00:25:25.768 "num_base_bdevs": 2, 00:25:25.768 "num_base_bdevs_discovered": 1, 00:25:25.768 "num_base_bdevs_operational": 1, 00:25:25.768 "base_bdevs_list": [ 00:25:25.768 { 00:25:25.768 "name": null, 00:25:25.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.768 "is_configured": false, 00:25:25.768 "data_offset": 256, 00:25:25.768 "data_size": 7936 00:25:25.768 }, 00:25:25.768 { 00:25:25.768 "name": "BaseBdev2", 00:25:25.768 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:25.768 "is_configured": true, 00:25:25.768 "data_offset": 256, 00:25:25.768 "data_size": 7936 00:25:25.768 } 00:25:25.768 ] 00:25:25.768 }' 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:25.768 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:26.027 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:26.287 [2024-05-15 00:07:26.803701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:26.287 [2024-05-15 00:07:26.803747] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:26.287 [2024-05-15 00:07:26.803768] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c6e00 00:25:26.287 [2024-05-15 00:07:26.803780] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:26.287 [2024-05-15 00:07:26.803977] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:26.287 [2024-05-15 00:07:26.803992] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:26.287 [2024-05-15 00:07:26.804038] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:26.287 [2024-05-15 00:07:26.804050] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:26.287 [2024-05-15 00:07:26.804060] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:26.287 BaseBdev1 00:25:26.287 00:07:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@786 -- # sleep 1 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.663 00:07:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.663 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:27.663 "name": "raid_bdev1", 00:25:27.663 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:27.663 "strip_size_kb": 0, 00:25:27.663 "state": "online", 00:25:27.663 
"raid_level": "raid1", 00:25:27.663 "superblock": true, 00:25:27.663 "num_base_bdevs": 2, 00:25:27.663 "num_base_bdevs_discovered": 1, 00:25:27.663 "num_base_bdevs_operational": 1, 00:25:27.663 "base_bdevs_list": [ 00:25:27.663 { 00:25:27.663 "name": null, 00:25:27.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.663 "is_configured": false, 00:25:27.663 "data_offset": 256, 00:25:27.663 "data_size": 7936 00:25:27.663 }, 00:25:27.663 { 00:25:27.663 "name": "BaseBdev2", 00:25:27.663 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:27.663 "is_configured": true, 00:25:27.664 "data_offset": 256, 00:25:27.664 "data_size": 7936 00:25:27.664 } 00:25:27.664 ] 00:25:27.664 }' 00:25:27.664 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:27.664 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.229 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.488 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:28.488 "name": "raid_bdev1", 00:25:28.488 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:28.488 "strip_size_kb": 0, 00:25:28.488 "state": "online", 00:25:28.488 "raid_level": "raid1", 00:25:28.488 "superblock": true, 00:25:28.488 "num_base_bdevs": 2, 00:25:28.488 "num_base_bdevs_discovered": 1, 00:25:28.488 "num_base_bdevs_operational": 1, 00:25:28.488 "base_bdevs_list": [ 00:25:28.488 { 00:25:28.488 "name": null, 00:25:28.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.488 "is_configured": false, 00:25:28.488 "data_offset": 256, 00:25:28.488 "data_size": 7936 00:25:28.488 }, 00:25:28.488 { 00:25:28.488 "name": "BaseBdev2", 00:25:28.488 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:28.488 "is_configured": true, 00:25:28.488 "data_offset": 256, 00:25:28.488 "data_size": 7936 00:25:28.488 } 00:25:28.488 ] 00:25:28.488 }' 00:25:28.488 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:28.488 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:28.488 00:07:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:28.488 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:28.747 [2024-05-15 00:07:29.230267] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:28.747 [2024-05-15 00:07:29.230394] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:28.747 [2024-05-15 00:07:29.230416] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:28.747 request: 00:25:28.747 { 00:25:28.747 "raid_bdev": "raid_bdev1", 00:25:28.747 "base_bdev": "BaseBdev1", 00:25:28.747 "method": "bdev_raid_add_base_bdev", 00:25:28.747 "req_id": 1 00:25:28.747 } 00:25:28.747 Got JSON-RPC error response 00:25:28.747 response: 00:25:28.747 { 00:25:28.747 "code": -22, 00:25:28.747 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:28.747 } 00:25:28.747 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:25:28.747 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:28.747 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:28.747 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:28.747 00:07:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # sleep 1 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.683 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.942 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:29.942 "name": "raid_bdev1", 00:25:29.942 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:29.942 "strip_size_kb": 0, 00:25:29.942 "state": "online", 00:25:29.942 "raid_level": "raid1", 00:25:29.942 "superblock": true, 00:25:29.942 "num_base_bdevs": 2, 00:25:29.942 "num_base_bdevs_discovered": 1, 00:25:29.942 "num_base_bdevs_operational": 1, 00:25:29.942 "base_bdevs_list": [ 00:25:29.942 { 00:25:29.942 "name": null, 00:25:29.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.942 "is_configured": false, 00:25:29.942 "data_offset": 256, 00:25:29.942 "data_size": 7936 00:25:29.942 }, 00:25:29.942 { 00:25:29.942 "name": "BaseBdev2", 00:25:29.942 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:29.942 "is_configured": true, 00:25:29.942 "data_offset": 256, 00:25:29.942 "data_size": 7936 00:25:29.942 } 00:25:29.942 ] 00:25:29.942 }' 00:25:29.942 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:29.942 00:07:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.509 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.767 00:07:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:30.768 "name": "raid_bdev1", 00:25:30.768 "uuid": "02454f3c-b9a3-4d0d-8a86-8192d2341601", 00:25:30.768 "strip_size_kb": 0, 00:25:30.768 "state": "online", 00:25:30.768 "raid_level": "raid1", 00:25:30.768 "superblock": true, 00:25:30.768 "num_base_bdevs": 2, 00:25:30.768 "num_base_bdevs_discovered": 1, 00:25:30.768 "num_base_bdevs_operational": 1, 00:25:30.768 "base_bdevs_list": [ 00:25:30.768 { 00:25:30.768 "name": null, 00:25:30.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.768 "is_configured": false, 00:25:30.768 "data_offset": 256, 00:25:30.768 "data_size": 7936 00:25:30.768 }, 00:25:30.768 { 00:25:30.768 "name": "BaseBdev2", 00:25:30.768 "uuid": "bb264898-5ade-522a-8463-382857fddca2", 00:25:30.768 "is_configured": true, 00:25:30.768 "data_offset": 256, 00:25:30.768 "data_size": 7936 00:25:30.768 } 00:25:30.768 ] 00:25:30.768 }' 00:25:30.768 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # killprocess 516992 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 516992 ']' 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 516992 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 516992 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 516992' 00:25:31.027 killing process with pid 516992 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 516992 00:25:31.027 Received shutdown signal, test time was about 60.000000 seconds 00:25:31.027 00:25:31.027 Latency(us) 00:25:31.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:31.027 =================================================================================================================== 00:25:31.027 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:31.027 [2024-05-15 00:07:31.443834] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:31.027 [2024-05-15 00:07:31.443928] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:31.027 [2024-05-15 00:07:31.443973] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:31.027 [2024-05-15 00:07:31.443984] 
bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b7e50 name raid_bdev1, state offline 00:25:31.027 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 516992 00:25:31.027 [2024-05-15 00:07:31.476173] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:31.287 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@797 -- # return 0 00:25:31.287 00:25:31.287 real 0m31.970s 00:25:31.287 user 0m49.994s 00:25:31.287 sys 0m5.153s 00:25:31.287 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:31.287 00:07:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:31.287 ************************************ 00:25:31.287 END TEST raid_rebuild_test_sb_md_separate 00:25:31.287 ************************************ 00:25:31.287 00:07:31 bdev_raid -- bdev/bdev_raid.sh@857 -- # base_malloc_params='-m 32 -i' 00:25:31.287 00:07:31 bdev_raid -- bdev/bdev_raid.sh@858 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:25:31.287 00:07:31 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:25:31.287 00:07:31 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:31.287 00:07:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:31.287 ************************************ 00:25:31.287 START TEST raid_state_function_test_sb_md_interleaved 00:25:31.287 ************************************ 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # raid_pid=521579 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 521579' 00:25:31.287 Process raid pid: 521579 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@247 -- # waitforlisten 521579 /var/tmp/spdk-raid.sock 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 521579 ']' 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:31.287 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:31.287 00:07:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:31.287 [2024-05-15 00:07:31.829119] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
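The state-function test that follows drives everything through rpc.py against the test app's UNIX socket. A rough hand-runnable sketch of the same happy-path setup, assuming the bdev_svc instance started above is already listening on /var/tmp/spdk-raid.sock; the commands, names and flags are the ones the trace itself issues, and only the $RPC shell shorthand is an addition here for brevity:

RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'

# two 32 MiB malloc bdevs with 4096-byte data blocks and 32 bytes of interleaved metadata
# (they show up in the dumps below as block_size 4128, num_blocks 8192)
$RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
$RPC bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2

# raid1 volume with an on-disk superblock (-s) over the two base bdevs
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# the trace below first sees "configuring" (bases missing), then "online" once both are claimed
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'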
00:25:31.287 [2024-05-15 00:07:31.829163] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:31.546 [2024-05-15 00:07:31.938983] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:31.546 [2024-05-15 00:07:32.042690] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:31.546 [2024-05-15 00:07:32.108087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:31.546 [2024-05-15 00:07:32.108125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:32.482 00:07:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:32.482 00:07:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:25:32.482 00:07:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:32.482 [2024-05-15 00:07:32.985052] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:32.482 [2024-05-15 00:07:32.985092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:32.482 [2024-05-15 00:07:32.985103] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:32.482 [2024-05-15 00:07:32.985115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.482 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:32.741 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:25:32.741 "name": "Existed_Raid", 00:25:32.741 "uuid": "67089c2b-d6fc-4528-b6bf-0ce67272c4c0", 00:25:32.741 "strip_size_kb": 0, 00:25:32.741 "state": "configuring", 00:25:32.741 "raid_level": "raid1", 00:25:32.741 "superblock": true, 00:25:32.741 "num_base_bdevs": 2, 00:25:32.741 "num_base_bdevs_discovered": 0, 00:25:32.741 "num_base_bdevs_operational": 2, 00:25:32.741 "base_bdevs_list": [ 00:25:32.741 { 00:25:32.741 "name": "BaseBdev1", 00:25:32.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.741 "is_configured": false, 00:25:32.741 "data_offset": 0, 00:25:32.741 "data_size": 0 00:25:32.741 }, 00:25:32.741 { 00:25:32.741 "name": "BaseBdev2", 00:25:32.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.741 "is_configured": false, 00:25:32.741 "data_offset": 0, 00:25:32.741 "data_size": 0 00:25:32.741 } 00:25:32.741 ] 00:25:32.741 }' 00:25:32.741 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:32.741 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:33.308 00:07:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:33.566 [2024-05-15 00:07:34.055731] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:33.566 [2024-05-15 00:07:34.055764] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc6fbc0 name Existed_Raid, state configuring 00:25:33.566 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:33.824 [2024-05-15 00:07:34.296384] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:33.825 [2024-05-15 00:07:34.296417] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:33.825 [2024-05-15 00:07:34.296428] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:33.825 [2024-05-15 00:07:34.296439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:33.825 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:25:34.085 [2024-05-15 00:07:34.548339] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:34.085 BaseBdev1 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 
-- # bdev_timeout=2000 00:25:34.085 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:34.344 00:07:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:34.602 [ 00:25:34.602 { 00:25:34.602 "name": "BaseBdev1", 00:25:34.602 "aliases": [ 00:25:34.602 "acfcebb8-2952-4f29-9fd6-0571dda50c8e" 00:25:34.602 ], 00:25:34.602 "product_name": "Malloc disk", 00:25:34.602 "block_size": 4128, 00:25:34.602 "num_blocks": 8192, 00:25:34.602 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:34.602 "md_size": 32, 00:25:34.602 "md_interleave": true, 00:25:34.602 "dif_type": 0, 00:25:34.602 "assigned_rate_limits": { 00:25:34.602 "rw_ios_per_sec": 0, 00:25:34.602 "rw_mbytes_per_sec": 0, 00:25:34.602 "r_mbytes_per_sec": 0, 00:25:34.602 "w_mbytes_per_sec": 0 00:25:34.602 }, 00:25:34.602 "claimed": true, 00:25:34.602 "claim_type": "exclusive_write", 00:25:34.602 "zoned": false, 00:25:34.602 "supported_io_types": { 00:25:34.602 "read": true, 00:25:34.602 "write": true, 00:25:34.602 "unmap": true, 00:25:34.602 "write_zeroes": true, 00:25:34.602 "flush": true, 00:25:34.602 "reset": true, 00:25:34.602 "compare": false, 00:25:34.602 "compare_and_write": false, 00:25:34.602 "abort": true, 00:25:34.602 "nvme_admin": false, 00:25:34.602 "nvme_io": false 00:25:34.602 }, 00:25:34.602 "memory_domains": [ 00:25:34.602 { 00:25:34.602 "dma_device_id": "system", 00:25:34.602 "dma_device_type": 1 00:25:34.602 }, 00:25:34.602 { 00:25:34.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:34.602 "dma_device_type": 2 00:25:34.602 } 00:25:34.602 ], 00:25:34.602 "driver_specific": {} 00:25:34.602 } 00:25:34.602 ] 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.602 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:34.861 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:34.861 "name": "Existed_Raid", 00:25:34.861 "uuid": "e7e946c5-5bfe-4de9-983f-7303e9c19d08", 00:25:34.861 "strip_size_kb": 0, 00:25:34.861 "state": "configuring", 00:25:34.861 "raid_level": "raid1", 00:25:34.861 "superblock": true, 00:25:34.861 "num_base_bdevs": 2, 00:25:34.861 "num_base_bdevs_discovered": 1, 00:25:34.861 "num_base_bdevs_operational": 2, 00:25:34.861 "base_bdevs_list": [ 00:25:34.861 { 00:25:34.861 "name": "BaseBdev1", 00:25:34.861 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:34.861 "is_configured": true, 00:25:34.861 "data_offset": 256, 00:25:34.861 "data_size": 7936 00:25:34.861 }, 00:25:34.861 { 00:25:34.861 "name": "BaseBdev2", 00:25:34.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.861 "is_configured": false, 00:25:34.861 "data_offset": 0, 00:25:34.861 "data_size": 0 00:25:34.861 } 00:25:34.861 ] 00:25:34.861 }' 00:25:34.861 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:34.861 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:35.429 00:07:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:35.687 [2024-05-15 00:07:36.144593] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:35.687 [2024-05-15 00:07:36.144635] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc6fe60 name Existed_Raid, state configuring 00:25:35.687 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:35.945 [2024-05-15 00:07:36.385268] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:35.945 [2024-05-15 00:07:36.386771] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:35.945 [2024-05-15 00:07:36.386803] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.945 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:36.202 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:36.202 "name": "Existed_Raid", 00:25:36.202 "uuid": "47b20153-54a3-4956-9bef-a74d7e42c384", 00:25:36.202 "strip_size_kb": 0, 00:25:36.202 "state": "configuring", 00:25:36.202 "raid_level": "raid1", 00:25:36.202 "superblock": true, 00:25:36.202 "num_base_bdevs": 2, 00:25:36.202 "num_base_bdevs_discovered": 1, 00:25:36.202 "num_base_bdevs_operational": 2, 00:25:36.202 "base_bdevs_list": [ 00:25:36.202 { 00:25:36.202 "name": "BaseBdev1", 00:25:36.202 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:36.202 "is_configured": true, 00:25:36.202 "data_offset": 256, 00:25:36.202 "data_size": 7936 00:25:36.202 }, 00:25:36.202 { 00:25:36.202 "name": "BaseBdev2", 00:25:36.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.202 "is_configured": false, 00:25:36.202 "data_offset": 0, 00:25:36.202 "data_size": 0 00:25:36.202 } 00:25:36.202 ] 00:25:36.202 }' 00:25:36.202 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:36.202 00:07:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:36.813 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:25:37.071 [2024-05-15 00:07:37.479665] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:37.071 [2024-05-15 00:07:37.479800] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc6f4b0 00:25:37.071 [2024-05-15 00:07:37.479813] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:37.071 [2024-05-15 00:07:37.479870] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0cc70 00:25:37.071 [2024-05-15 00:07:37.479947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc6f4b0 00:25:37.071 [2024-05-15 00:07:37.479956] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc6f4b0 00:25:37.071 [2024-05-15 00:07:37.480009] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:37.071 BaseBdev2 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev2 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:37.071 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:37.329 [ 00:25:37.329 { 00:25:37.329 "name": "BaseBdev2", 00:25:37.329 "aliases": [ 00:25:37.329 "693e55c6-66de-4cc1-a30f-d501fce9be28" 00:25:37.329 ], 00:25:37.329 "product_name": "Malloc disk", 00:25:37.329 "block_size": 4128, 00:25:37.329 "num_blocks": 8192, 00:25:37.329 "uuid": "693e55c6-66de-4cc1-a30f-d501fce9be28", 00:25:37.329 "md_size": 32, 00:25:37.329 "md_interleave": true, 00:25:37.329 "dif_type": 0, 00:25:37.329 "assigned_rate_limits": { 00:25:37.329 "rw_ios_per_sec": 0, 00:25:37.329 "rw_mbytes_per_sec": 0, 00:25:37.329 "r_mbytes_per_sec": 0, 00:25:37.329 "w_mbytes_per_sec": 0 00:25:37.329 }, 00:25:37.329 "claimed": true, 00:25:37.329 "claim_type": "exclusive_write", 00:25:37.329 "zoned": false, 00:25:37.329 "supported_io_types": { 00:25:37.329 "read": true, 00:25:37.329 "write": true, 00:25:37.329 "unmap": true, 00:25:37.329 "write_zeroes": true, 00:25:37.329 "flush": true, 00:25:37.329 "reset": true, 00:25:37.329 "compare": false, 00:25:37.329 "compare_and_write": false, 00:25:37.329 "abort": true, 00:25:37.329 "nvme_admin": false, 00:25:37.329 "nvme_io": false 00:25:37.329 }, 00:25:37.329 "memory_domains": [ 00:25:37.329 { 00:25:37.329 "dma_device_id": "system", 00:25:37.329 "dma_device_type": 1 00:25:37.329 }, 00:25:37.329 { 00:25:37.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:37.329 "dma_device_type": 2 00:25:37.329 } 00:25:37.329 ], 00:25:37.329 "driver_specific": {} 00:25:37.329 } 00:25:37.329 ] 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:37.329 00:07:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.329 00:07:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:37.588 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:37.588 "name": "Existed_Raid", 00:25:37.588 "uuid": "47b20153-54a3-4956-9bef-a74d7e42c384", 00:25:37.588 "strip_size_kb": 0, 00:25:37.588 "state": "online", 00:25:37.588 "raid_level": "raid1", 00:25:37.588 "superblock": true, 00:25:37.588 "num_base_bdevs": 2, 00:25:37.588 "num_base_bdevs_discovered": 2, 00:25:37.588 "num_base_bdevs_operational": 2, 00:25:37.588 "base_bdevs_list": [ 00:25:37.588 { 00:25:37.588 "name": "BaseBdev1", 00:25:37.588 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:37.588 "is_configured": true, 00:25:37.588 "data_offset": 256, 00:25:37.588 "data_size": 7936 00:25:37.588 }, 00:25:37.588 { 00:25:37.588 "name": "BaseBdev2", 00:25:37.588 "uuid": "693e55c6-66de-4cc1-a30f-d501fce9be28", 00:25:37.588 "is_configured": true, 00:25:37.588 "data_offset": 256, 00:25:37.588 "data_size": 7936 00:25:37.588 } 00:25:37.588 ] 00:25:37.588 }' 00:25:37.588 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:37.588 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:38.154 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:25:38.413 [2024-05-15 00:07:38.855591] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:25:38.413 "name": "Existed_Raid", 00:25:38.413 "aliases": [ 00:25:38.413 "47b20153-54a3-4956-9bef-a74d7e42c384" 00:25:38.413 ], 00:25:38.413 "product_name": "Raid Volume", 00:25:38.413 "block_size": 4128, 00:25:38.413 "num_blocks": 7936, 00:25:38.413 "uuid": "47b20153-54a3-4956-9bef-a74d7e42c384", 00:25:38.413 "md_size": 32, 00:25:38.413 "md_interleave": true, 00:25:38.413 "dif_type": 0, 00:25:38.413 "assigned_rate_limits": { 00:25:38.413 "rw_ios_per_sec": 0, 00:25:38.413 "rw_mbytes_per_sec": 0, 00:25:38.413 "r_mbytes_per_sec": 0, 00:25:38.413 "w_mbytes_per_sec": 0 00:25:38.413 }, 00:25:38.413 "claimed": false, 00:25:38.413 "zoned": false, 00:25:38.413 "supported_io_types": { 00:25:38.413 "read": true, 00:25:38.413 "write": true, 00:25:38.413 "unmap": false, 00:25:38.413 "write_zeroes": true, 00:25:38.413 "flush": false, 00:25:38.413 "reset": true, 00:25:38.413 "compare": false, 00:25:38.413 "compare_and_write": false, 00:25:38.413 "abort": false, 00:25:38.413 "nvme_admin": false, 00:25:38.413 "nvme_io": false 00:25:38.413 }, 00:25:38.413 "memory_domains": [ 00:25:38.413 { 00:25:38.413 "dma_device_id": "system", 00:25:38.413 "dma_device_type": 1 00:25:38.413 }, 00:25:38.413 { 00:25:38.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:38.413 "dma_device_type": 2 00:25:38.413 }, 00:25:38.413 { 00:25:38.413 "dma_device_id": "system", 00:25:38.413 "dma_device_type": 1 00:25:38.413 }, 00:25:38.413 { 00:25:38.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:38.413 "dma_device_type": 2 00:25:38.413 } 00:25:38.413 ], 00:25:38.413 "driver_specific": { 00:25:38.413 "raid": { 00:25:38.413 "uuid": "47b20153-54a3-4956-9bef-a74d7e42c384", 00:25:38.413 "strip_size_kb": 0, 00:25:38.413 "state": "online", 00:25:38.413 "raid_level": "raid1", 00:25:38.413 "superblock": true, 00:25:38.413 "num_base_bdevs": 2, 00:25:38.413 "num_base_bdevs_discovered": 2, 00:25:38.413 "num_base_bdevs_operational": 2, 00:25:38.413 "base_bdevs_list": [ 00:25:38.413 { 00:25:38.413 "name": "BaseBdev1", 00:25:38.413 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:38.413 "is_configured": true, 00:25:38.413 "data_offset": 256, 00:25:38.413 "data_size": 7936 00:25:38.413 }, 00:25:38.413 { 00:25:38.413 "name": "BaseBdev2", 00:25:38.413 "uuid": "693e55c6-66de-4cc1-a30f-d501fce9be28", 00:25:38.413 "is_configured": true, 00:25:38.413 "data_offset": 256, 00:25:38.413 "data_size": 7936 00:25:38.413 } 00:25:38.413 ] 00:25:38.413 } 00:25:38.413 } 00:25:38.413 }' 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:25:38.413 BaseBdev2' 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:38.413 00:07:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:38.672 00:07:39 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:38.672 "name": "BaseBdev1", 00:25:38.672 "aliases": [ 00:25:38.672 "acfcebb8-2952-4f29-9fd6-0571dda50c8e" 00:25:38.672 ], 00:25:38.672 "product_name": "Malloc disk", 00:25:38.672 "block_size": 4128, 00:25:38.672 "num_blocks": 8192, 00:25:38.672 "uuid": "acfcebb8-2952-4f29-9fd6-0571dda50c8e", 00:25:38.672 "md_size": 32, 00:25:38.672 "md_interleave": true, 00:25:38.672 "dif_type": 0, 00:25:38.672 "assigned_rate_limits": { 00:25:38.672 "rw_ios_per_sec": 0, 00:25:38.672 "rw_mbytes_per_sec": 0, 00:25:38.672 "r_mbytes_per_sec": 0, 00:25:38.672 "w_mbytes_per_sec": 0 00:25:38.672 }, 00:25:38.672 "claimed": true, 00:25:38.672 "claim_type": "exclusive_write", 00:25:38.672 "zoned": false, 00:25:38.672 "supported_io_types": { 00:25:38.672 "read": true, 00:25:38.672 "write": true, 00:25:38.672 "unmap": true, 00:25:38.672 "write_zeroes": true, 00:25:38.672 "flush": true, 00:25:38.672 "reset": true, 00:25:38.672 "compare": false, 00:25:38.672 "compare_and_write": false, 00:25:38.672 "abort": true, 00:25:38.672 "nvme_admin": false, 00:25:38.672 "nvme_io": false 00:25:38.672 }, 00:25:38.672 "memory_domains": [ 00:25:38.672 { 00:25:38.672 "dma_device_id": "system", 00:25:38.672 "dma_device_type": 1 00:25:38.672 }, 00:25:38.672 { 00:25:38.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:38.672 "dma_device_type": 2 00:25:38.672 } 00:25:38.672 ], 00:25:38.672 "driver_specific": {} 00:25:38.672 }' 00:25:38.672 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:38.672 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:38.672 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:38.672 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:38.931 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:39.189 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:39.189 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:39.189 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:39.189 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:39.189 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 
00:25:39.189 "name": "BaseBdev2", 00:25:39.189 "aliases": [ 00:25:39.189 "693e55c6-66de-4cc1-a30f-d501fce9be28" 00:25:39.189 ], 00:25:39.189 "product_name": "Malloc disk", 00:25:39.189 "block_size": 4128, 00:25:39.189 "num_blocks": 8192, 00:25:39.189 "uuid": "693e55c6-66de-4cc1-a30f-d501fce9be28", 00:25:39.189 "md_size": 32, 00:25:39.189 "md_interleave": true, 00:25:39.189 "dif_type": 0, 00:25:39.189 "assigned_rate_limits": { 00:25:39.189 "rw_ios_per_sec": 0, 00:25:39.189 "rw_mbytes_per_sec": 0, 00:25:39.189 "r_mbytes_per_sec": 0, 00:25:39.189 "w_mbytes_per_sec": 0 00:25:39.189 }, 00:25:39.189 "claimed": true, 00:25:39.189 "claim_type": "exclusive_write", 00:25:39.189 "zoned": false, 00:25:39.189 "supported_io_types": { 00:25:39.189 "read": true, 00:25:39.189 "write": true, 00:25:39.189 "unmap": true, 00:25:39.189 "write_zeroes": true, 00:25:39.189 "flush": true, 00:25:39.189 "reset": true, 00:25:39.189 "compare": false, 00:25:39.189 "compare_and_write": false, 00:25:39.189 "abort": true, 00:25:39.189 "nvme_admin": false, 00:25:39.189 "nvme_io": false 00:25:39.189 }, 00:25:39.189 "memory_domains": [ 00:25:39.189 { 00:25:39.189 "dma_device_id": "system", 00:25:39.189 "dma_device_type": 1 00:25:39.189 }, 00:25:39.189 { 00:25:39.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:39.189 "dma_device_type": 2 00:25:39.189 } 00:25:39.189 ], 00:25:39.189 "driver_specific": {} 00:25:39.189 }' 00:25:39.190 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:39.448 00:07:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:39.707 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:39.707 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:39.707 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:39.707 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:39.707 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:39.966 [2024-05-15 00:07:40.359386] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # local expected_state 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 
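BaseBdev1 has just been deleted above, and because raid1 has redundancy (the has_redundancy check traced here returns 0), the test expects the array to stay online with only one discovered base bdev. A condensed, hand-runnable version of that probe, reusing the $RPC shorthand from the earlier sketch; the jq expression simply folds the test's separate per-field checks into one line:

# mirrors the bdev_malloc_delete BaseBdev1 just traced above
$RPC bdev_malloc_delete BaseBdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")
    | "\(.state) discovered=\(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
# expected, matching the dump below: online discovered=1/2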
00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.966 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:40.224 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:40.224 "name": "Existed_Raid", 00:25:40.224 "uuid": "47b20153-54a3-4956-9bef-a74d7e42c384", 00:25:40.224 "strip_size_kb": 0, 00:25:40.224 "state": "online", 00:25:40.224 "raid_level": "raid1", 00:25:40.224 "superblock": true, 00:25:40.224 "num_base_bdevs": 2, 00:25:40.224 "num_base_bdevs_discovered": 1, 00:25:40.224 "num_base_bdevs_operational": 1, 00:25:40.224 "base_bdevs_list": [ 00:25:40.224 { 00:25:40.224 "name": null, 00:25:40.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.224 "is_configured": false, 00:25:40.224 "data_offset": 256, 00:25:40.224 "data_size": 7936 00:25:40.224 }, 00:25:40.224 { 00:25:40.224 "name": "BaseBdev2", 00:25:40.224 "uuid": "693e55c6-66de-4cc1-a30f-d501fce9be28", 00:25:40.224 "is_configured": true, 00:25:40.224 "data_offset": 256, 00:25:40.224 "data_size": 7936 00:25:40.224 } 00:25:40.224 ] 00:25:40.224 }' 00:25:40.224 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:40.224 00:07:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:40.791 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:25:40.792 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:25:40.792 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.792 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:25:41.051 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:25:41.051 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:41.051 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:41.311 [2024-05-15 00:07:41.700012] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:41.311 [2024-05-15 00:07:41.700092] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:41.311 [2024-05-15 00:07:41.711131] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:41.311 [2024-05-15 00:07:41.711195] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:41.311 [2024-05-15 00:07:41.711208] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc6f4b0 name Existed_Raid, state offline 00:25:41.311 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:25:41.311 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:25:41.311 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.311 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@342 -- # killprocess 521579 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 521579 ']' 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 521579 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:41.570 00:07:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 521579 00:25:41.570 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:41.570 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:41.570 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with 
pid 521579' 00:25:41.570 killing process with pid 521579 00:25:41.570 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 521579 00:25:41.570 [2024-05-15 00:07:42.017123] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:41.570 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 521579 00:25:41.570 [2024-05-15 00:07:42.017998] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:41.829 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@344 -- # return 0 00:25:41.829 00:25:41.829 real 0m10.458s 00:25:41.829 user 0m18.545s 00:25:41.829 sys 0m1.977s 00:25:41.829 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:41.829 00:07:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:41.829 ************************************ 00:25:41.829 END TEST raid_state_function_test_sb_md_interleaved 00:25:41.829 ************************************ 00:25:41.829 00:07:42 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:25:41.829 00:07:42 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:25:41.829 00:07:42 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:41.829 00:07:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:41.829 ************************************ 00:25:41.829 START TEST raid_superblock_test_md_interleaved 00:25:41.829 ************************************ 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:25:41.829 00:07:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # raid_pid=523140 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # waitforlisten 523140 /var/tmp/spdk-raid.sock 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 523140 ']' 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:41.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:41.829 00:07:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:41.829 [2024-05-15 00:07:42.393209] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:25:41.829 [2024-05-15 00:07:42.393276] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid523140 ] 00:25:42.088 [2024-05-15 00:07:42.520231] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:42.088 [2024-05-15 00:07:42.617374] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:42.417 [2024-05-15 00:07:42.680925] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:42.417 [2024-05-15 00:07:42.680962] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # 
base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:42.678 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:25:42.937 malloc1 00:25:42.937 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:43.194 [2024-05-15 00:07:43.715894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:43.194 [2024-05-15 00:07:43.715942] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.194 [2024-05-15 00:07:43.715967] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233f750 00:25:43.194 [2024-05-15 00:07:43.715980] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.194 [2024-05-15 00:07:43.717558] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.194 [2024-05-15 00:07:43.717587] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:43.194 pt1 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:43.194 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:25:43.452 malloc2 00:25:43.452 00:07:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:43.710 [2024-05-15 00:07:44.210265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:43.710 [2024-05-15 00:07:44.210312] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.710 [2024-05-15 00:07:44.210332] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2336da0 00:25:43.710 [2024-05-15 00:07:44.210345] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.710 [2024-05-15 00:07:44.211804] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.710 [2024-05-15 00:07:44.211831] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 
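For the superblock test the base bdevs are passthru bdevs (pt1/pt2) layered on interleaved-metadata malloc disks. A sketch of the equivalent RPC sequence, again assuming the $RPC shorthand from the first sketch and using the exact names and UUIDs the test passes in:

# passthru-on-malloc base bdevs with fixed UUIDs
$RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc1
$RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc2
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# -s reserves room for the on-disk superblock; the dumps below report data_offset 256
# (rather than 0) and data_size 7936 for each configured base bdev
$RPC bdev_raid_create -s -r raid1 -b 'pt1 pt2' -n raid_bdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'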
00:25:43.710 pt2 00:25:43.710 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:25:43.711 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:43.711 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:43.969 [2024-05-15 00:07:44.454928] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:43.969 [2024-05-15 00:07:44.456359] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:43.969 [2024-05-15 00:07:44.456521] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x23385c0 00:25:43.969 [2024-05-15 00:07:44.456535] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:43.969 [2024-05-15 00:07:44.456605] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a2400 00:25:43.969 [2024-05-15 00:07:44.456691] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23385c0 00:25:43.969 [2024-05-15 00:07:44.456701] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23385c0 00:25:43.969 [2024-05-15 00:07:44.456760] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.969 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.227 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:44.227 "name": "raid_bdev1", 00:25:44.227 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:44.227 "strip_size_kb": 0, 00:25:44.227 "state": "online", 00:25:44.227 "raid_level": "raid1", 00:25:44.227 "superblock": true, 00:25:44.227 "num_base_bdevs": 2, 00:25:44.227 "num_base_bdevs_discovered": 2, 00:25:44.227 
"num_base_bdevs_operational": 2, 00:25:44.227 "base_bdevs_list": [ 00:25:44.227 { 00:25:44.227 "name": "pt1", 00:25:44.227 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:44.228 "is_configured": true, 00:25:44.228 "data_offset": 256, 00:25:44.228 "data_size": 7936 00:25:44.228 }, 00:25:44.228 { 00:25:44.228 "name": "pt2", 00:25:44.228 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:44.228 "is_configured": true, 00:25:44.228 "data_offset": 256, 00:25:44.228 "data_size": 7936 00:25:44.228 } 00:25:44.228 ] 00:25:44.228 }' 00:25:44.228 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:44.228 00:07:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:44.794 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:25:45.053 [2024-05-15 00:07:45.493867] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:25:45.053 "name": "raid_bdev1", 00:25:45.053 "aliases": [ 00:25:45.053 "729c08fa-f747-487f-bbf9-58f92c1a79ea" 00:25:45.053 ], 00:25:45.053 "product_name": "Raid Volume", 00:25:45.053 "block_size": 4128, 00:25:45.053 "num_blocks": 7936, 00:25:45.053 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:45.053 "md_size": 32, 00:25:45.053 "md_interleave": true, 00:25:45.053 "dif_type": 0, 00:25:45.053 "assigned_rate_limits": { 00:25:45.053 "rw_ios_per_sec": 0, 00:25:45.053 "rw_mbytes_per_sec": 0, 00:25:45.053 "r_mbytes_per_sec": 0, 00:25:45.053 "w_mbytes_per_sec": 0 00:25:45.053 }, 00:25:45.053 "claimed": false, 00:25:45.053 "zoned": false, 00:25:45.053 "supported_io_types": { 00:25:45.053 "read": true, 00:25:45.053 "write": true, 00:25:45.053 "unmap": false, 00:25:45.053 "write_zeroes": true, 00:25:45.053 "flush": false, 00:25:45.053 "reset": true, 00:25:45.053 "compare": false, 00:25:45.053 "compare_and_write": false, 00:25:45.053 "abort": false, 00:25:45.053 "nvme_admin": false, 00:25:45.053 "nvme_io": false 00:25:45.053 }, 00:25:45.053 "memory_domains": [ 00:25:45.053 { 00:25:45.053 "dma_device_id": "system", 00:25:45.053 "dma_device_type": 1 00:25:45.053 }, 00:25:45.053 { 00:25:45.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:45.053 "dma_device_type": 2 00:25:45.053 }, 00:25:45.053 { 00:25:45.053 "dma_device_id": "system", 00:25:45.053 "dma_device_type": 1 00:25:45.053 }, 00:25:45.053 { 00:25:45.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:45.053 
"dma_device_type": 2 00:25:45.053 } 00:25:45.053 ], 00:25:45.053 "driver_specific": { 00:25:45.053 "raid": { 00:25:45.053 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:45.053 "strip_size_kb": 0, 00:25:45.053 "state": "online", 00:25:45.053 "raid_level": "raid1", 00:25:45.053 "superblock": true, 00:25:45.053 "num_base_bdevs": 2, 00:25:45.053 "num_base_bdevs_discovered": 2, 00:25:45.053 "num_base_bdevs_operational": 2, 00:25:45.053 "base_bdevs_list": [ 00:25:45.053 { 00:25:45.053 "name": "pt1", 00:25:45.053 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:45.053 "is_configured": true, 00:25:45.053 "data_offset": 256, 00:25:45.053 "data_size": 7936 00:25:45.053 }, 00:25:45.053 { 00:25:45.053 "name": "pt2", 00:25:45.053 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:45.053 "is_configured": true, 00:25:45.053 "data_offset": 256, 00:25:45.053 "data_size": 7936 00:25:45.053 } 00:25:45.053 ] 00:25:45.053 } 00:25:45.053 } 00:25:45.053 }' 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:25:45.053 pt2' 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:45.053 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:45.311 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:45.311 "name": "pt1", 00:25:45.311 "aliases": [ 00:25:45.311 "e7b0a638-4f5b-5d08-ad08-76802de3f576" 00:25:45.311 ], 00:25:45.311 "product_name": "passthru", 00:25:45.311 "block_size": 4128, 00:25:45.311 "num_blocks": 8192, 00:25:45.311 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:45.311 "md_size": 32, 00:25:45.311 "md_interleave": true, 00:25:45.311 "dif_type": 0, 00:25:45.311 "assigned_rate_limits": { 00:25:45.311 "rw_ios_per_sec": 0, 00:25:45.311 "rw_mbytes_per_sec": 0, 00:25:45.311 "r_mbytes_per_sec": 0, 00:25:45.311 "w_mbytes_per_sec": 0 00:25:45.311 }, 00:25:45.311 "claimed": true, 00:25:45.311 "claim_type": "exclusive_write", 00:25:45.311 "zoned": false, 00:25:45.311 "supported_io_types": { 00:25:45.311 "read": true, 00:25:45.311 "write": true, 00:25:45.311 "unmap": true, 00:25:45.311 "write_zeroes": true, 00:25:45.311 "flush": true, 00:25:45.311 "reset": true, 00:25:45.311 "compare": false, 00:25:45.311 "compare_and_write": false, 00:25:45.311 "abort": true, 00:25:45.311 "nvme_admin": false, 00:25:45.311 "nvme_io": false 00:25:45.311 }, 00:25:45.311 "memory_domains": [ 00:25:45.311 { 00:25:45.311 "dma_device_id": "system", 00:25:45.311 "dma_device_type": 1 00:25:45.311 }, 00:25:45.311 { 00:25:45.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:45.311 "dma_device_type": 2 00:25:45.311 } 00:25:45.312 ], 00:25:45.312 "driver_specific": { 00:25:45.312 "passthru": { 00:25:45.312 "name": "pt1", 00:25:45.312 "base_bdev_name": "malloc1" 00:25:45.312 } 00:25:45.312 } 00:25:45.312 }' 00:25:45.312 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:45.312 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:45.312 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:45.312 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:45.569 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:45.569 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:45.569 00:07:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:45.569 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:45.827 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:45.827 "name": "pt2", 00:25:45.827 "aliases": [ 00:25:45.827 "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d" 00:25:45.827 ], 00:25:45.827 "product_name": "passthru", 00:25:45.827 "block_size": 4128, 00:25:45.827 "num_blocks": 8192, 00:25:45.827 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:45.827 "md_size": 32, 00:25:45.827 "md_interleave": true, 00:25:45.827 "dif_type": 0, 00:25:45.827 "assigned_rate_limits": { 00:25:45.827 "rw_ios_per_sec": 0, 00:25:45.827 "rw_mbytes_per_sec": 0, 00:25:45.827 "r_mbytes_per_sec": 0, 00:25:45.827 "w_mbytes_per_sec": 0 00:25:45.827 }, 00:25:45.827 "claimed": true, 00:25:45.827 "claim_type": "exclusive_write", 00:25:45.827 "zoned": false, 00:25:45.827 "supported_io_types": { 00:25:45.827 "read": true, 00:25:45.827 "write": true, 00:25:45.827 "unmap": true, 00:25:45.827 "write_zeroes": true, 00:25:45.827 "flush": true, 00:25:45.827 "reset": true, 00:25:45.827 "compare": false, 00:25:45.827 "compare_and_write": false, 00:25:45.827 "abort": true, 00:25:45.827 "nvme_admin": false, 00:25:45.827 "nvme_io": false 00:25:45.827 }, 00:25:45.827 "memory_domains": [ 00:25:45.827 { 00:25:45.827 "dma_device_id": "system", 00:25:45.827 "dma_device_type": 1 00:25:45.827 }, 00:25:45.827 { 00:25:45.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:45.827 "dma_device_type": 2 00:25:45.827 } 00:25:45.827 ], 00:25:45.827 "driver_specific": { 00:25:45.827 "passthru": { 00:25:45.827 "name": "pt2", 00:25:45.827 "base_bdev_name": "malloc2" 00:25:45.827 } 00:25:45.827 } 00:25:45.827 }' 00:25:45.827 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:46.085 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:46.343 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:46.343 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:46.343 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:46.343 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:25:46.601 [2024-05-15 00:07:46.953742] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:46.601 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=729c08fa-f747-487f-bbf9-58f92c1a79ea 00:25:46.601 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # '[' -z 729c08fa-f747-487f-bbf9-58f92c1a79ea ']' 00:25:46.601 00:07:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:46.859 [2024-05-15 00:07:47.194145] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:46.859 [2024-05-15 00:07:47.194167] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:46.859 [2024-05-15 00:07:47.194223] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:46.859 [2024-05-15 00:07:47.194282] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:46.859 [2024-05-15 00:07:47.194294] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23385c0 name raid_bdev1, state offline 00:25:46.859 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.859 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:25:47.116 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:47.373 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:47.373 00:07:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:47.632 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:47.890 [2024-05-15 00:07:48.337122] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:47.890 [2024-05-15 00:07:48.338467] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:47.890 [2024-05-15 00:07:48.338525] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:47.890 [2024-05-15 00:07:48.338563] 
bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:47.890 [2024-05-15 00:07:48.338582] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:47.890 [2024-05-15 00:07:48.338592] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23390a0 name raid_bdev1, state configuring 00:25:47.890 request: 00:25:47.890 { 00:25:47.890 "name": "raid_bdev1", 00:25:47.890 "raid_level": "raid1", 00:25:47.890 "base_bdevs": [ 00:25:47.890 "malloc1", 00:25:47.890 "malloc2" 00:25:47.890 ], 00:25:47.890 "superblock": false, 00:25:47.890 "method": "bdev_raid_create", 00:25:47.890 "req_id": 1 00:25:47.890 } 00:25:47.890 Got JSON-RPC error response 00:25:47.890 response: 00:25:47.890 { 00:25:47.890 "code": -17, 00:25:47.890 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:47.890 } 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.890 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:25:48.148 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:25:48.148 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:25:48.148 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:48.406 [2024-05-15 00:07:48.806297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:48.406 [2024-05-15 00:07:48.806342] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:48.406 [2024-05-15 00:07:48.806362] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2337590 00:25:48.406 [2024-05-15 00:07:48.806374] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:48.406 [2024-05-15 00:07:48.807792] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:48.406 [2024-05-15 00:07:48.807819] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:48.406 [2024-05-15 00:07:48.807865] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:25:48.406 [2024-05-15 00:07:48.807889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:48.406 pt1 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- 
# local expected_state=configuring 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.406 00:07:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.664 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:48.664 "name": "raid_bdev1", 00:25:48.664 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:48.664 "strip_size_kb": 0, 00:25:48.664 "state": "configuring", 00:25:48.664 "raid_level": "raid1", 00:25:48.664 "superblock": true, 00:25:48.664 "num_base_bdevs": 2, 00:25:48.664 "num_base_bdevs_discovered": 1, 00:25:48.664 "num_base_bdevs_operational": 2, 00:25:48.664 "base_bdevs_list": [ 00:25:48.664 { 00:25:48.664 "name": "pt1", 00:25:48.664 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:48.664 "is_configured": true, 00:25:48.664 "data_offset": 256, 00:25:48.664 "data_size": 7936 00:25:48.664 }, 00:25:48.664 { 00:25:48.664 "name": null, 00:25:48.664 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:48.664 "is_configured": false, 00:25:48.664 "data_offset": 256, 00:25:48.664 "data_size": 7936 00:25:48.664 } 00:25:48.664 ] 00:25:48.664 }' 00:25:48.664 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:48.664 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:49.229 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:25:49.229 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:25:49.229 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:25:49.229 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:49.487 [2024-05-15 00:07:49.885153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:49.487 [2024-05-15 00:07:49.885202] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.487 [2024-05-15 00:07:49.885223] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233a0e0 00:25:49.487 [2024-05-15 00:07:49.885235] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.487 
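
The last several entries (bdev_raid.sh@441-479) exercise superblock-driven re-assembly: raid_bdev1 and both passthru bdevs are deleted, a bdev_raid_create placed directly on malloc1/malloc2 is expected to fail with JSON-RPC error -17 (File exists) because both base bdevs still carry the raid superblock, and re-registering pt1/pt2 lets the examine path rebuild the array ("configuring" after pt1, back to "online" once pt2 appears below). A hedged sketch of that sequence, with every command copied from the trace:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $rpc bdev_raid_delete raid_bdev1      # raid goes offline and is cleaned up
    $rpc bdev_passthru_delete pt1
    $rpc bdev_passthru_delete pt2

    # Must fail: both malloc bdevs still hold the superblock written through pt1/pt2,
    # so the RPC returns -17 "Failed to create RAID bdev raid_bdev1: File exists".
    if $rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1; then
        echo "unexpected success" >&2
        exit 1
    fi

    # Re-creating the passthru bdevs lets the examine path find the superblock
    # and re-assemble raid_bdev1.
    $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
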
[2024-05-15 00:07:49.885406] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.487 [2024-05-15 00:07:49.885423] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:49.487 [2024-05-15 00:07:49.885465] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:25:49.487 [2024-05-15 00:07:49.885482] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:49.487 [2024-05-15 00:07:49.885563] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x233a7b0 00:25:49.487 [2024-05-15 00:07:49.885573] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:49.487 [2024-05-15 00:07:49.885633] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233fe10 00:25:49.487 [2024-05-15 00:07:49.885710] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x233a7b0 00:25:49.487 [2024-05-15 00:07:49.885719] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x233a7b0 00:25:49.487 [2024-05-15 00:07:49.885777] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:49.487 pt2 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.487 00:07:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.745 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:49.745 "name": "raid_bdev1", 00:25:49.745 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:49.745 "strip_size_kb": 0, 00:25:49.745 "state": "online", 00:25:49.745 "raid_level": "raid1", 00:25:49.745 "superblock": true, 00:25:49.745 "num_base_bdevs": 2, 00:25:49.745 "num_base_bdevs_discovered": 2, 00:25:49.745 
"num_base_bdevs_operational": 2, 00:25:49.745 "base_bdevs_list": [ 00:25:49.745 { 00:25:49.745 "name": "pt1", 00:25:49.745 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:49.745 "is_configured": true, 00:25:49.745 "data_offset": 256, 00:25:49.745 "data_size": 7936 00:25:49.745 }, 00:25:49.745 { 00:25:49.745 "name": "pt2", 00:25:49.745 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:49.745 "is_configured": true, 00:25:49.745 "data_offset": 256, 00:25:49.745 "data_size": 7936 00:25:49.745 } 00:25:49.745 ] 00:25:49.745 }' 00:25:49.745 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:49.745 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:50.310 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:25:50.310 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:25:50.310 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:25:50.311 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:25:50.311 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:25:50.311 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:25:50.311 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:50.311 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:25:50.569 [2024-05-15 00:07:50.908095] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:25:50.569 "name": "raid_bdev1", 00:25:50.569 "aliases": [ 00:25:50.569 "729c08fa-f747-487f-bbf9-58f92c1a79ea" 00:25:50.569 ], 00:25:50.569 "product_name": "Raid Volume", 00:25:50.569 "block_size": 4128, 00:25:50.569 "num_blocks": 7936, 00:25:50.569 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:50.569 "md_size": 32, 00:25:50.569 "md_interleave": true, 00:25:50.569 "dif_type": 0, 00:25:50.569 "assigned_rate_limits": { 00:25:50.569 "rw_ios_per_sec": 0, 00:25:50.569 "rw_mbytes_per_sec": 0, 00:25:50.569 "r_mbytes_per_sec": 0, 00:25:50.569 "w_mbytes_per_sec": 0 00:25:50.569 }, 00:25:50.569 "claimed": false, 00:25:50.569 "zoned": false, 00:25:50.569 "supported_io_types": { 00:25:50.569 "read": true, 00:25:50.569 "write": true, 00:25:50.569 "unmap": false, 00:25:50.569 "write_zeroes": true, 00:25:50.569 "flush": false, 00:25:50.569 "reset": true, 00:25:50.569 "compare": false, 00:25:50.569 "compare_and_write": false, 00:25:50.569 "abort": false, 00:25:50.569 "nvme_admin": false, 00:25:50.569 "nvme_io": false 00:25:50.569 }, 00:25:50.569 "memory_domains": [ 00:25:50.569 { 00:25:50.569 "dma_device_id": "system", 00:25:50.569 "dma_device_type": 1 00:25:50.569 }, 00:25:50.569 { 00:25:50.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.569 "dma_device_type": 2 00:25:50.569 }, 00:25:50.569 { 00:25:50.569 "dma_device_id": "system", 00:25:50.569 "dma_device_type": 1 00:25:50.569 }, 00:25:50.569 { 00:25:50.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.569 
"dma_device_type": 2 00:25:50.569 } 00:25:50.569 ], 00:25:50.569 "driver_specific": { 00:25:50.569 "raid": { 00:25:50.569 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:50.569 "strip_size_kb": 0, 00:25:50.569 "state": "online", 00:25:50.569 "raid_level": "raid1", 00:25:50.569 "superblock": true, 00:25:50.569 "num_base_bdevs": 2, 00:25:50.569 "num_base_bdevs_discovered": 2, 00:25:50.569 "num_base_bdevs_operational": 2, 00:25:50.569 "base_bdevs_list": [ 00:25:50.569 { 00:25:50.569 "name": "pt1", 00:25:50.569 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:50.569 "is_configured": true, 00:25:50.569 "data_offset": 256, 00:25:50.569 "data_size": 7936 00:25:50.569 }, 00:25:50.569 { 00:25:50.569 "name": "pt2", 00:25:50.569 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:50.569 "is_configured": true, 00:25:50.569 "data_offset": 256, 00:25:50.569 "data_size": 7936 00:25:50.569 } 00:25:50.569 ] 00:25:50.569 } 00:25:50.569 } 00:25:50.569 }' 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:25:50.569 pt2' 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:50.569 00:07:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:50.827 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:50.827 "name": "pt1", 00:25:50.827 "aliases": [ 00:25:50.827 "e7b0a638-4f5b-5d08-ad08-76802de3f576" 00:25:50.827 ], 00:25:50.827 "product_name": "passthru", 00:25:50.827 "block_size": 4128, 00:25:50.827 "num_blocks": 8192, 00:25:50.827 "uuid": "e7b0a638-4f5b-5d08-ad08-76802de3f576", 00:25:50.827 "md_size": 32, 00:25:50.828 "md_interleave": true, 00:25:50.828 "dif_type": 0, 00:25:50.828 "assigned_rate_limits": { 00:25:50.828 "rw_ios_per_sec": 0, 00:25:50.828 "rw_mbytes_per_sec": 0, 00:25:50.828 "r_mbytes_per_sec": 0, 00:25:50.828 "w_mbytes_per_sec": 0 00:25:50.828 }, 00:25:50.828 "claimed": true, 00:25:50.828 "claim_type": "exclusive_write", 00:25:50.828 "zoned": false, 00:25:50.828 "supported_io_types": { 00:25:50.828 "read": true, 00:25:50.828 "write": true, 00:25:50.828 "unmap": true, 00:25:50.828 "write_zeroes": true, 00:25:50.828 "flush": true, 00:25:50.828 "reset": true, 00:25:50.828 "compare": false, 00:25:50.828 "compare_and_write": false, 00:25:50.828 "abort": true, 00:25:50.828 "nvme_admin": false, 00:25:50.828 "nvme_io": false 00:25:50.828 }, 00:25:50.828 "memory_domains": [ 00:25:50.828 { 00:25:50.828 "dma_device_id": "system", 00:25:50.828 "dma_device_type": 1 00:25:50.828 }, 00:25:50.828 { 00:25:50.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:50.828 "dma_device_type": 2 00:25:50.828 } 00:25:50.828 ], 00:25:50.828 "driver_specific": { 00:25:50.828 "passthru": { 00:25:50.828 "name": "pt1", 00:25:50.828 "base_bdev_name": "malloc1" 00:25:50.828 } 00:25:50.828 } 00:25:50.828 }' 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:50.828 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:51.086 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:51.344 "name": "pt2", 00:25:51.344 "aliases": [ 00:25:51.344 "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d" 00:25:51.344 ], 00:25:51.344 "product_name": "passthru", 00:25:51.344 "block_size": 4128, 00:25:51.344 "num_blocks": 8192, 00:25:51.344 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:51.344 "md_size": 32, 00:25:51.344 "md_interleave": true, 00:25:51.344 "dif_type": 0, 00:25:51.344 "assigned_rate_limits": { 00:25:51.344 "rw_ios_per_sec": 0, 00:25:51.344 "rw_mbytes_per_sec": 0, 00:25:51.344 "r_mbytes_per_sec": 0, 00:25:51.344 "w_mbytes_per_sec": 0 00:25:51.344 }, 00:25:51.344 "claimed": true, 00:25:51.344 "claim_type": "exclusive_write", 00:25:51.344 "zoned": false, 00:25:51.344 "supported_io_types": { 00:25:51.344 "read": true, 00:25:51.344 "write": true, 00:25:51.344 "unmap": true, 00:25:51.344 "write_zeroes": true, 00:25:51.344 "flush": true, 00:25:51.344 "reset": true, 00:25:51.344 "compare": false, 00:25:51.344 "compare_and_write": false, 00:25:51.344 "abort": true, 00:25:51.344 "nvme_admin": false, 00:25:51.344 "nvme_io": false 00:25:51.344 }, 00:25:51.344 "memory_domains": [ 00:25:51.344 { 00:25:51.344 "dma_device_id": "system", 00:25:51.344 "dma_device_type": 1 00:25:51.344 }, 00:25:51.344 { 00:25:51.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:51.344 "dma_device_type": 2 00:25:51.344 } 00:25:51.344 ], 00:25:51.344 "driver_specific": { 00:25:51.344 "passthru": { 00:25:51.344 "name": "pt2", 00:25:51.344 "base_bdev_name": "malloc2" 00:25:51.344 } 00:25:51.344 } 00:25:51.344 }' 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:51.344 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:51.602 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:51.602 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:51.602 00:07:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:51.602 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:51.602 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:51.602 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:51.602 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:25:51.859 [2024-05-15 00:07:52.235626] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:51.859 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # '[' 729c08fa-f747-487f-bbf9-58f92c1a79ea '!=' 729c08fa-f747-487f-bbf9-58f92c1a79ea ']' 00:25:51.859 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:25:51.859 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 00:25:51.859 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:25:51.859 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:52.118 [2024-05-15 00:07:52.472038] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:52.118 00:07:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.118 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.375 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:52.375 "name": "raid_bdev1", 00:25:52.375 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:52.375 "strip_size_kb": 0, 00:25:52.375 "state": "online", 00:25:52.375 "raid_level": "raid1", 00:25:52.375 "superblock": true, 00:25:52.375 "num_base_bdevs": 2, 00:25:52.375 "num_base_bdevs_discovered": 1, 00:25:52.376 "num_base_bdevs_operational": 1, 00:25:52.376 "base_bdevs_list": [ 00:25:52.376 { 00:25:52.376 "name": null, 00:25:52.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.376 "is_configured": false, 00:25:52.376 "data_offset": 256, 00:25:52.376 "data_size": 7936 00:25:52.376 }, 00:25:52.376 { 00:25:52.376 "name": "pt2", 00:25:52.376 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:52.376 "is_configured": true, 00:25:52.376 "data_offset": 256, 00:25:52.376 "data_size": 7936 00:25:52.376 } 00:25:52.376 ] 00:25:52.376 }' 00:25:52.376 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:52.376 00:07:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:52.941 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:53.198 [2024-05-15 00:07:53.546992] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:53.198 [2024-05-15 00:07:53.547020] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:53.198 [2024-05-15 00:07:53.547078] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:53.198 [2024-05-15 00:07:53.547127] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:53.198 [2024-05-15 00:07:53.547139] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233a7b0 name raid_bdev1, state offline 00:25:53.198 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.198 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:25:53.455 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:25:53.455 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:25:53.455 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:25:53.455 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:25:53.455 00:07:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:53.713 00:07:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:25:53.713 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # i=1 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:53.714 [2024-05-15 00:07:54.276874] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:53.714 [2024-05-15 00:07:54.276923] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.714 [2024-05-15 00:07:54.276942] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233fc10 00:25:53.714 [2024-05-15 00:07:54.276955] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.714 [2024-05-15 00:07:54.278437] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.714 [2024-05-15 00:07:54.278464] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:53.714 [2024-05-15 00:07:54.278513] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:25:53.714 [2024-05-15 00:07:54.278540] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:53.714 [2024-05-15 00:07:54.278607] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x233aa50 00:25:53.714 [2024-05-15 00:07:54.278617] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:53.714 [2024-05-15 00:07:54.278677] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233e440 00:25:53.714 [2024-05-15 00:07:54.278761] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x233aa50 00:25:53.714 [2024-05-15 00:07:54.278771] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x233aa50 00:25:53.714 [2024-05-15 00:07:54.278828] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:53.714 pt2 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.714 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.972 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:53.972 "name": "raid_bdev1", 00:25:53.972 "uuid": "729c08fa-f747-487f-bbf9-58f92c1a79ea", 00:25:53.972 "strip_size_kb": 0, 00:25:53.972 "state": "online", 00:25:53.972 "raid_level": "raid1", 00:25:53.972 "superblock": true, 00:25:53.972 "num_base_bdevs": 2, 00:25:53.972 "num_base_bdevs_discovered": 1, 00:25:53.972 "num_base_bdevs_operational": 1, 00:25:53.972 "base_bdevs_list": [ 00:25:53.972 { 00:25:53.972 "name": null, 00:25:53.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.972 "is_configured": false, 00:25:53.972 "data_offset": 256, 00:25:53.972 "data_size": 7936 00:25:53.972 }, 00:25:53.972 { 00:25:53.972 "name": "pt2", 00:25:53.972 "uuid": "4ad78fe3-e8db-57bc-80d5-65ce472c7f1d", 00:25:53.972 "is_configured": true, 00:25:53.972 "data_offset": 256, 00:25:53.972 "data_size": 7936 00:25:53.972 } 00:25:53.972 ] 00:25:53.972 }' 00:25:53.972 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:53.972 00:07:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:54.539 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:25:54.539 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:54.539 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:25:54.797 [2024-05-15 00:07:55.331875] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # '[' 729c08fa-f747-487f-bbf9-58f92c1a79ea '!=' 729c08fa-f747-487f-bbf9-58f92c1a79ea ']' 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@568 -- # killprocess 523140 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 523140 ']' 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 523140 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:54.797 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 523140 00:25:55.056 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:55.056 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # '[' 
reactor_0 = sudo ']' 00:25:55.056 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 523140' 00:25:55.056 killing process with pid 523140 00:25:55.056 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@965 -- # kill 523140 00:25:55.056 [2024-05-15 00:07:55.402304] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:55.056 [2024-05-15 00:07:55.402364] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:55.056 [2024-05-15 00:07:55.402416] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:55.056 [2024-05-15 00:07:55.402428] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233aa50 name raid_bdev1, state offline 00:25:55.056 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@970 -- # wait 523140 00:25:55.056 [2024-05-15 00:07:55.419663] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:55.315 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # return 0 00:25:55.315 00:25:55.315 real 0m13.329s 00:25:55.315 user 0m24.009s 00:25:55.315 sys 0m2.435s 00:25:55.315 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:55.315 00:07:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:55.315 ************************************ 00:25:55.315 END TEST raid_superblock_test_md_interleaved 00:25:55.315 ************************************ 00:25:55.315 00:07:55 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:25:55.315 00:07:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:25:55.315 00:07:55 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:55.315 00:07:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:55.315 ************************************ 00:25:55.315 START TEST raid_rebuild_test_sb_md_interleaved 00:25:55.315 ************************************ 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false false 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local verify=false 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs 
)) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # local strip_size 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@582 -- # local create_arg 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local data_offset 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # raid_pid=525078 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@603 -- # waitforlisten 525078 /var/tmp/spdk-raid.sock 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 525078 ']' 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:55.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:55.315 00:07:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:55.315 [2024-05-15 00:07:55.795350] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
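
The rebuild test that starts here drives I/O through bdevperf instead of a bare SPDK target. Its launch, traced at bdev_raid.sh@601-603, reduces to the command below; the command line is copied verbatim from the log and only the commonly used options are annotated (the remaining flags are reproduced as-is, without interpretation).

    # bdevperf launch (pid 525078 in this run):
    #   -r  RPC socket the test scripts talk to      -t  run time in seconds
    #   -w randrw -M 50   random mixed I/O, 50/50 read/write split
    #   -o  I/O size (3 MiB, hence the "zero copy threshold" notice just below)
    #   -q  queue depth
    #   -L bdev_raid  enables the debug log that produces the *DEBUG* lines in this output
    #   -T, -U and -z are copied verbatim from the trace
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid
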
00:25:55.315 [2024-05-15 00:07:55.795424] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid525078 ] 00:25:55.315 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:55.315 Zero copy mechanism will not be used. 00:25:55.574 [2024-05-15 00:07:55.924534] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.574 [2024-05-15 00:07:56.030259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.574 [2024-05-15 00:07:56.104236] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:55.574 [2024-05-15 00:07:56.104274] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:56.140 00:07:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:56.140 00:07:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:25:56.140 00:07:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:56.140 00:07:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:25:56.399 BaseBdev1_malloc 00:25:56.399 00:07:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:56.658 [2024-05-15 00:07:57.195504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:56.658 [2024-05-15 00:07:57.195554] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:56.658 [2024-05-15 00:07:57.195579] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa42470 00:25:56.658 [2024-05-15 00:07:57.195592] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:56.658 [2024-05-15 00:07:57.197065] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:56.658 [2024-05-15 00:07:57.197093] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:56.658 BaseBdev1 00:25:56.658 00:07:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:56.658 00:07:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:25:56.916 BaseBdev2_malloc 00:25:56.916 00:07:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:57.175 [2024-05-15 00:07:57.681816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:57.175 [2024-05-15 00:07:57.681869] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.175 [2024-05-15 00:07:57.681890] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb9a190 00:25:57.175 [2024-05-15 00:07:57.681903] 
vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.175 [2024-05-15 00:07:57.683368] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.175 [2024-05-15 00:07:57.683397] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:57.175 BaseBdev2 00:25:57.175 00:07:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:25:57.433 spare_malloc 00:25:57.433 00:07:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:57.692 spare_delay 00:25:57.692 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:57.951 [2024-05-15 00:07:58.390133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:57.951 [2024-05-15 00:07:58.390179] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.951 [2024-05-15 00:07:58.390202] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xac34e0 00:25:57.951 [2024-05-15 00:07:58.390215] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.951 [2024-05-15 00:07:58.391615] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.951 [2024-05-15 00:07:58.391641] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:57.951 spare 00:25:57.951 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:58.212 [2024-05-15 00:07:58.630804] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:58.212 [2024-05-15 00:07:58.632043] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:58.212 [2024-05-15 00:07:58.632209] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xb49060 00:25:58.212 [2024-05-15 00:07:58.632222] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:58.212 [2024-05-15 00:07:58.632292] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa38500 00:25:58.212 [2024-05-15 00:07:58.632378] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb49060 00:25:58.212 [2024-05-15 00:07:58.632388] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb49060 00:25:58.212 [2024-05-15 00:07:58.632454] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:58.212 
00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.212 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.501 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:58.501 "name": "raid_bdev1", 00:25:58.501 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:25:58.501 "strip_size_kb": 0, 00:25:58.501 "state": "online", 00:25:58.501 "raid_level": "raid1", 00:25:58.501 "superblock": true, 00:25:58.501 "num_base_bdevs": 2, 00:25:58.501 "num_base_bdevs_discovered": 2, 00:25:58.501 "num_base_bdevs_operational": 2, 00:25:58.501 "base_bdevs_list": [ 00:25:58.501 { 00:25:58.501 "name": "BaseBdev1", 00:25:58.501 "uuid": "6ef0e247-7a79-5cc3-abe7-a1b390a5bfbe", 00:25:58.501 "is_configured": true, 00:25:58.501 "data_offset": 256, 00:25:58.501 "data_size": 7936 00:25:58.501 }, 00:25:58.501 { 00:25:58.501 "name": "BaseBdev2", 00:25:58.501 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:25:58.501 "is_configured": true, 00:25:58.501 "data_offset": 256, 00:25:58.501 "data_size": 7936 00:25:58.501 } 00:25:58.501 ] 00:25:58.501 }' 00:25:58.501 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:58.501 00:07:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:59.069 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:59.069 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:25:59.327 [2024-05-15 00:07:59.673761] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:59.327 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:25:59.327 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.327 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:59.586 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:25:59.586 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@626 -- # '[' false = 
true ']' 00:25:59.586 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@629 -- # '[' false = true ']' 00:25:59.586 00:07:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:59.845 [2024-05-15 00:08:00.182881] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.845 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.103 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:00.103 "name": "raid_bdev1", 00:26:00.104 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:00.104 "strip_size_kb": 0, 00:26:00.104 "state": "online", 00:26:00.104 "raid_level": "raid1", 00:26:00.104 "superblock": true, 00:26:00.104 "num_base_bdevs": 2, 00:26:00.104 "num_base_bdevs_discovered": 1, 00:26:00.104 "num_base_bdevs_operational": 1, 00:26:00.104 "base_bdevs_list": [ 00:26:00.104 { 00:26:00.104 "name": null, 00:26:00.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.104 "is_configured": false, 00:26:00.104 "data_offset": 256, 00:26:00.104 "data_size": 7936 00:26:00.104 }, 00:26:00.104 { 00:26:00.104 "name": "BaseBdev2", 00:26:00.104 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:00.104 "is_configured": true, 00:26:00.104 "data_offset": 256, 00:26:00.104 "data_size": 7936 00:26:00.104 } 00:26:00.104 ] 00:26:00.104 }' 00:26:00.104 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:00.104 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:00.671 00:08:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:00.671 [2024-05-15 00:08:01.213638] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:00.671 [2024-05-15 00:08:01.217209] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa39500 00:26:00.671 [2024-05-15 00:08:01.219048] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:00.671 00:08:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # sleep 1 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:02.048 "name": "raid_bdev1", 00:26:02.048 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:02.048 "strip_size_kb": 0, 00:26:02.048 "state": "online", 00:26:02.048 "raid_level": "raid1", 00:26:02.048 "superblock": true, 00:26:02.048 "num_base_bdevs": 2, 00:26:02.048 "num_base_bdevs_discovered": 2, 00:26:02.048 "num_base_bdevs_operational": 2, 00:26:02.048 "process": { 00:26:02.048 "type": "rebuild", 00:26:02.048 "target": "spare", 00:26:02.048 "progress": { 00:26:02.048 "blocks": 3072, 00:26:02.048 "percent": 38 00:26:02.048 } 00:26:02.048 }, 00:26:02.048 "base_bdevs_list": [ 00:26:02.048 { 00:26:02.048 "name": "spare", 00:26:02.048 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:02.048 "is_configured": true, 00:26:02.048 "data_offset": 256, 00:26:02.048 "data_size": 7936 00:26:02.048 }, 00:26:02.048 { 00:26:02.048 "name": "BaseBdev2", 00:26:02.048 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:02.048 "is_configured": true, 00:26:02.048 "data_offset": 256, 00:26:02.048 "data_size": 7936 00:26:02.048 } 00:26:02.048 ] 00:26:02.048 }' 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:02.048 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:02.307 [2024-05-15 00:08:02.796107] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:02.307 [2024-05-15 00:08:02.831629] bdev_raid.c:2467:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:02.307 [2024-05-15 00:08:02.831676] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.307 00:08:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.564 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:02.564 "name": "raid_bdev1", 00:26:02.564 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:02.564 "strip_size_kb": 0, 00:26:02.564 "state": "online", 00:26:02.564 "raid_level": "raid1", 00:26:02.564 "superblock": true, 00:26:02.564 "num_base_bdevs": 2, 00:26:02.564 "num_base_bdevs_discovered": 1, 00:26:02.564 "num_base_bdevs_operational": 1, 00:26:02.564 "base_bdevs_list": [ 00:26:02.564 { 00:26:02.564 "name": null, 00:26:02.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.564 "is_configured": false, 00:26:02.564 "data_offset": 256, 00:26:02.564 "data_size": 7936 00:26:02.564 }, 00:26:02.564 { 00:26:02.564 "name": "BaseBdev2", 00:26:02.564 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:02.564 "is_configured": true, 00:26:02.564 "data_offset": 256, 00:26:02.564 "data_size": 7936 00:26:02.564 } 00:26:02.564 ] 00:26:02.564 }' 00:26:02.564 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:02.564 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:03.129 00:08:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.129 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.386 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:03.386 "name": "raid_bdev1", 00:26:03.386 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:03.386 "strip_size_kb": 0, 00:26:03.386 "state": "online", 00:26:03.386 "raid_level": "raid1", 00:26:03.386 "superblock": true, 00:26:03.386 "num_base_bdevs": 2, 00:26:03.386 "num_base_bdevs_discovered": 1, 00:26:03.386 "num_base_bdevs_operational": 1, 00:26:03.386 "base_bdevs_list": [ 00:26:03.386 { 00:26:03.386 "name": null, 00:26:03.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.386 "is_configured": false, 00:26:03.386 "data_offset": 256, 00:26:03.386 "data_size": 7936 00:26:03.386 }, 00:26:03.386 { 00:26:03.386 "name": "BaseBdev2", 00:26:03.386 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:03.386 "is_configured": true, 00:26:03.386 "data_offset": 256, 00:26:03.386 "data_size": 7936 00:26:03.386 } 00:26:03.386 ] 00:26:03.386 }' 00:26:03.386 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:03.386 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:03.386 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:03.644 00:08:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:03.644 00:08:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:03.644 [2024-05-15 00:08:04.223163] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.644 [2024-05-15 00:08:04.226727] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa38c40 00:26:03.644 [2024-05-15 00:08:04.228167] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:03.901 00:08:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # sleep 1 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:04.833 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.833 00:08:05 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:05.091 "name": "raid_bdev1", 00:26:05.091 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:05.091 "strip_size_kb": 0, 00:26:05.091 "state": "online", 00:26:05.091 "raid_level": "raid1", 00:26:05.091 "superblock": true, 00:26:05.091 "num_base_bdevs": 2, 00:26:05.091 "num_base_bdevs_discovered": 2, 00:26:05.091 "num_base_bdevs_operational": 2, 00:26:05.091 "process": { 00:26:05.091 "type": "rebuild", 00:26:05.091 "target": "spare", 00:26:05.091 "progress": { 00:26:05.091 "blocks": 3072, 00:26:05.091 "percent": 38 00:26:05.091 } 00:26:05.091 }, 00:26:05.091 "base_bdevs_list": [ 00:26:05.091 { 00:26:05.091 "name": "spare", 00:26:05.091 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:05.091 "is_configured": true, 00:26:05.091 "data_offset": 256, 00:26:05.091 "data_size": 7936 00:26:05.091 }, 00:26:05.091 { 00:26:05.091 "name": "BaseBdev2", 00:26:05.091 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:05.091 "is_configured": true, 00:26:05.091 "data_offset": 256, 00:26:05.091 "data_size": 7936 00:26:05.091 } 00:26:05.091 ] 00:26:05.091 }' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:26:05.091 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@711 -- # local timeout=976 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.091 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:05.349 "name": "raid_bdev1", 00:26:05.349 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:05.349 "strip_size_kb": 0, 00:26:05.349 "state": "online", 00:26:05.349 "raid_level": "raid1", 00:26:05.349 "superblock": true, 00:26:05.349 "num_base_bdevs": 2, 00:26:05.349 "num_base_bdevs_discovered": 2, 00:26:05.349 "num_base_bdevs_operational": 2, 00:26:05.349 "process": { 00:26:05.349 "type": "rebuild", 00:26:05.349 "target": "spare", 00:26:05.349 "progress": { 00:26:05.349 "blocks": 3840, 00:26:05.349 "percent": 48 00:26:05.349 } 00:26:05.349 }, 00:26:05.349 "base_bdevs_list": [ 00:26:05.349 { 00:26:05.349 "name": "spare", 00:26:05.349 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:05.349 "is_configured": true, 00:26:05.349 "data_offset": 256, 00:26:05.349 "data_size": 7936 00:26:05.349 }, 00:26:05.349 { 00:26:05.349 "name": "BaseBdev2", 00:26:05.349 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:05.349 "is_configured": true, 00:26:05.349 "data_offset": 256, 00:26:05.349 "data_size": 7936 00:26:05.349 } 00:26:05.349 ] 00:26:05.349 }' 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:05.349 00:08:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@716 -- # sleep 1 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.721 00:08:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:06.721 "name": "raid_bdev1", 00:26:06.721 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:06.721 "strip_size_kb": 0, 00:26:06.721 "state": "online", 00:26:06.721 "raid_level": "raid1", 00:26:06.721 "superblock": true, 00:26:06.721 
"num_base_bdevs": 2, 00:26:06.721 "num_base_bdevs_discovered": 2, 00:26:06.721 "num_base_bdevs_operational": 2, 00:26:06.721 "process": { 00:26:06.721 "type": "rebuild", 00:26:06.721 "target": "spare", 00:26:06.721 "progress": { 00:26:06.721 "blocks": 7168, 00:26:06.721 "percent": 90 00:26:06.721 } 00:26:06.721 }, 00:26:06.721 "base_bdevs_list": [ 00:26:06.721 { 00:26:06.721 "name": "spare", 00:26:06.721 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:06.721 "is_configured": true, 00:26:06.721 "data_offset": 256, 00:26:06.721 "data_size": 7936 00:26:06.721 }, 00:26:06.721 { 00:26:06.721 "name": "BaseBdev2", 00:26:06.721 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:06.721 "is_configured": true, 00:26:06.721 "data_offset": 256, 00:26:06.721 "data_size": 7936 00:26:06.721 } 00:26:06.721 ] 00:26:06.721 }' 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:06.721 00:08:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@716 -- # sleep 1 00:26:06.979 [2024-05-15 00:08:07.352558] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:06.979 [2024-05-15 00:08:07.352617] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:06.979 [2024-05-15 00:08:07.352699] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:07.911 "name": "raid_bdev1", 00:26:07.911 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:07.911 "strip_size_kb": 0, 00:26:07.911 "state": "online", 00:26:07.911 "raid_level": "raid1", 00:26:07.911 "superblock": true, 00:26:07.911 "num_base_bdevs": 2, 00:26:07.911 "num_base_bdevs_discovered": 2, 00:26:07.911 "num_base_bdevs_operational": 2, 00:26:07.911 "base_bdevs_list": [ 00:26:07.911 { 00:26:07.911 "name": "spare", 00:26:07.911 "uuid": 
"ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:07.911 "is_configured": true, 00:26:07.911 "data_offset": 256, 00:26:07.911 "data_size": 7936 00:26:07.911 }, 00:26:07.911 { 00:26:07.911 "name": "BaseBdev2", 00:26:07.911 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:07.911 "is_configured": true, 00:26:07.911 "data_offset": 256, 00:26:07.911 "data_size": 7936 00:26:07.911 } 00:26:07.911 ] 00:26:07.911 }' 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:07.911 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # break 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.169 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:08.427 "name": "raid_bdev1", 00:26:08.427 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:08.427 "strip_size_kb": 0, 00:26:08.427 "state": "online", 00:26:08.427 "raid_level": "raid1", 00:26:08.427 "superblock": true, 00:26:08.427 "num_base_bdevs": 2, 00:26:08.427 "num_base_bdevs_discovered": 2, 00:26:08.427 "num_base_bdevs_operational": 2, 00:26:08.427 "base_bdevs_list": [ 00:26:08.427 { 00:26:08.427 "name": "spare", 00:26:08.427 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:08.427 "is_configured": true, 00:26:08.427 "data_offset": 256, 00:26:08.427 "data_size": 7936 00:26:08.427 }, 00:26:08.427 { 00:26:08.427 "name": "BaseBdev2", 00:26:08.427 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:08.427 "is_configured": true, 00:26:08.427 "data_offset": 256, 00:26:08.427 "data_size": 7936 00:26:08.427 } 00:26:08.427 ] 00:26:08.427 }' 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:08.427 00:08:08 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.427 00:08:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.685 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:08.685 "name": "raid_bdev1", 00:26:08.685 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:08.685 "strip_size_kb": 0, 00:26:08.685 "state": "online", 00:26:08.685 "raid_level": "raid1", 00:26:08.685 "superblock": true, 00:26:08.685 "num_base_bdevs": 2, 00:26:08.685 "num_base_bdevs_discovered": 2, 00:26:08.685 "num_base_bdevs_operational": 2, 00:26:08.685 "base_bdevs_list": [ 00:26:08.685 { 00:26:08.685 "name": "spare", 00:26:08.685 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:08.685 "is_configured": true, 00:26:08.685 "data_offset": 256, 00:26:08.685 "data_size": 7936 00:26:08.685 }, 00:26:08.685 { 00:26:08.685 "name": "BaseBdev2", 00:26:08.685 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:08.685 "is_configured": true, 00:26:08.685 "data_offset": 256, 00:26:08.685 "data_size": 7936 00:26:08.685 } 00:26:08.685 ] 00:26:08.685 }' 00:26:08.685 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:08.685 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:09.251 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:09.508 [2024-05-15 00:08:09.911408] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:09.508 [2024-05-15 00:08:09.911437] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:09.508 [2024-05-15 00:08:09.911500] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:09.508 [2024-05-15 00:08:09.911561] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:09.508 [2024-05-15 
00:08:09.911574] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb49060 name raid_bdev1, state offline 00:26:09.508 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.508 00:08:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # jq length 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@727 -- # '[' false = true ']' 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:26:09.767 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:10.025 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:10.283 [2024-05-15 00:08:10.641505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:10.283 [2024-05-15 00:08:10.641554] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:10.283 [2024-05-15 00:08:10.641574] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3c5b0 00:26:10.283 [2024-05-15 00:08:10.641586] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:10.283 [2024-05-15 00:08:10.643068] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:10.283 [2024-05-15 00:08:10.643095] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:10.283 [2024-05-15 00:08:10.643146] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:10.284 [2024-05-15 00:08:10.643173] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:10.284 BaseBdev1 00:26:10.284 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:26:10.284 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:26:10.284 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:26:10.541 00:08:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:10.541 [2024-05-15 00:08:11.122894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:10.541 [2024-05-15 00:08:11.122935] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:10.541 [2024-05-15 00:08:11.122956] vbdev_passthru.c: 
676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3c220 00:26:10.541 [2024-05-15 00:08:11.122968] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:10.541 [2024-05-15 00:08:11.123124] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:10.541 [2024-05-15 00:08:11.123140] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:10.541 [2024-05-15 00:08:11.123181] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:26:10.541 [2024-05-15 00:08:11.123193] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:26:10.541 [2024-05-15 00:08:11.123203] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:10.541 [2024-05-15 00:08:11.123219] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa3ba70 name raid_bdev1, state configuring 00:26:10.541 [2024-05-15 00:08:11.123251] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:10.541 BaseBdev2 00:26:10.799 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:10.799 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:11.057 [2024-05-15 00:08:11.588135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:11.057 [2024-05-15 00:08:11.588174] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:11.057 [2024-05-15 00:08:11.588194] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa3bfb0 00:26:11.057 [2024-05-15 00:08:11.588206] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:11.057 [2024-05-15 00:08:11.588377] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:11.057 [2024-05-15 00:08:11.588393] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:11.057 [2024-05-15 00:08:11.588453] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:26:11.057 [2024-05-15 00:08:11.588472] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:11.057 spare 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:11.057 00:08:11 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.057 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.316 [2024-05-15 00:08:11.688791] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xa387a0 00:26:11.316 [2024-05-15 00:08:11.688808] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:11.316 [2024-05-15 00:08:11.688883] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbdba00 00:26:11.316 [2024-05-15 00:08:11.688974] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa387a0 00:26:11.316 [2024-05-15 00:08:11.688983] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa387a0 00:26:11.316 [2024-05-15 00:08:11.689047] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.316 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:11.316 "name": "raid_bdev1", 00:26:11.316 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:11.316 "strip_size_kb": 0, 00:26:11.316 "state": "online", 00:26:11.316 "raid_level": "raid1", 00:26:11.316 "superblock": true, 00:26:11.316 "num_base_bdevs": 2, 00:26:11.316 "num_base_bdevs_discovered": 2, 00:26:11.316 "num_base_bdevs_operational": 2, 00:26:11.316 "base_bdevs_list": [ 00:26:11.316 { 00:26:11.316 "name": "spare", 00:26:11.316 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:11.316 "is_configured": true, 00:26:11.316 "data_offset": 256, 00:26:11.316 "data_size": 7936 00:26:11.316 }, 00:26:11.316 { 00:26:11.316 "name": "BaseBdev2", 00:26:11.316 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:11.316 "is_configured": true, 00:26:11.316 "data_offset": 256, 00:26:11.316 "data_size": 7936 00:26:11.316 } 00:26:11.316 ] 00:26:11.316 }' 00:26:11.316 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:11.316 00:08:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:11.882 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.882 00:08:12 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:12.141 "name": "raid_bdev1", 00:26:12.141 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:12.141 "strip_size_kb": 0, 00:26:12.141 "state": "online", 00:26:12.141 "raid_level": "raid1", 00:26:12.141 "superblock": true, 00:26:12.141 "num_base_bdevs": 2, 00:26:12.141 "num_base_bdevs_discovered": 2, 00:26:12.141 "num_base_bdevs_operational": 2, 00:26:12.141 "base_bdevs_list": [ 00:26:12.141 { 00:26:12.141 "name": "spare", 00:26:12.141 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:12.141 "is_configured": true, 00:26:12.141 "data_offset": 256, 00:26:12.141 "data_size": 7936 00:26:12.141 }, 00:26:12.141 { 00:26:12.141 "name": "BaseBdev2", 00:26:12.141 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:12.141 "is_configured": true, 00:26:12.141 "data_offset": 256, 00:26:12.141 "data_size": 7936 00:26:12.141 } 00:26:12.141 ] 00:26:12.141 }' 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.141 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:12.400 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:26:12.400 00:08:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:12.658 [2024-05-15 00:08:13.164448] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:12.658 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:12.658 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:12.658 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:12.658 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 
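The verify_raid_bdev_state call being traced here (raid_bdev1 online raid1 0 1, right after the spare was removed) reduces to fetching the raid bdev's JSON over RPC and comparing a few fields. A minimal sketch using only the RPC command, jq filter, and field names that appear in the captured output; the plain test-command wrapper is an assumption, not the helper's exact code:

  tmp=$("$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")')
  # degraded expectations at this point in the log: still online, raid1, one base bdev left
  [ "$(echo "$tmp" | jq -r '.state')" = online ]
  [ "$(echo "$tmp" | jq -r '.raid_level')" = raid1 ]
  [ "$(echo "$tmp" | jq -r '.num_base_bdevs_discovered')" -eq 1 ]
  [ "$(echo "$tmp" | jq -r '.num_base_bdevs_operational')" -eq 1 ]

The healthy-array checks earlier in these records ("online raid1 0 2") follow the same pattern with both counts expected to be 2.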
00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.659 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.917 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:12.917 "name": "raid_bdev1", 00:26:12.917 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:12.917 "strip_size_kb": 0, 00:26:12.917 "state": "online", 00:26:12.917 "raid_level": "raid1", 00:26:12.917 "superblock": true, 00:26:12.917 "num_base_bdevs": 2, 00:26:12.917 "num_base_bdevs_discovered": 1, 00:26:12.917 "num_base_bdevs_operational": 1, 00:26:12.917 "base_bdevs_list": [ 00:26:12.917 { 00:26:12.917 "name": null, 00:26:12.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.917 "is_configured": false, 00:26:12.917 "data_offset": 256, 00:26:12.917 "data_size": 7936 00:26:12.917 }, 00:26:12.917 { 00:26:12.917 "name": "BaseBdev2", 00:26:12.917 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:12.917 "is_configured": true, 00:26:12.917 "data_offset": 256, 00:26:12.917 "data_size": 7936 00:26:12.917 } 00:26:12.917 ] 00:26:12.917 }' 00:26:12.917 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:12.917 00:08:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:13.482 00:08:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:13.740 [2024-05-15 00:08:14.235282] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:13.740 [2024-05-15 00:08:14.235439] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:13.740 [2024-05-15 00:08:14.235457] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
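The spare remove/re-add cycle whose superblock sequence numbers are compared just above comes down to two RPC calls, with the resulting rebuild observed through the same bdev_raid_get_bdevs JSON. A sketch built only from commands and jq filters present in this log; the polling comment is an assumption about how the test waits for completion:

  "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_remove_base_bdev spare
  "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_add_base_bdev raid_bdev1 spare
  # while the rebuild runs, process.type/.target read "rebuild"/"spare"; both fall back to "none" once it finishes
  "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'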
00:26:13.740 [2024-05-15 00:08:14.235486] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:13.740 [2024-05-15 00:08:14.238903] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb48460 00:26:13.740 [2024-05-15 00:08:14.240407] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:13.740 00:08:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # sleep 1 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.113 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:15.113 "name": "raid_bdev1", 00:26:15.113 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:15.113 "strip_size_kb": 0, 00:26:15.113 "state": "online", 00:26:15.113 "raid_level": "raid1", 00:26:15.113 "superblock": true, 00:26:15.113 "num_base_bdevs": 2, 00:26:15.113 "num_base_bdevs_discovered": 2, 00:26:15.113 "num_base_bdevs_operational": 2, 00:26:15.113 "process": { 00:26:15.113 "type": "rebuild", 00:26:15.113 "target": "spare", 00:26:15.113 "progress": { 00:26:15.113 "blocks": 3072, 00:26:15.114 "percent": 38 00:26:15.114 } 00:26:15.114 }, 00:26:15.114 "base_bdevs_list": [ 00:26:15.114 { 00:26:15.114 "name": "spare", 00:26:15.114 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:15.114 "is_configured": true, 00:26:15.114 "data_offset": 256, 00:26:15.114 "data_size": 7936 00:26:15.114 }, 00:26:15.114 { 00:26:15.114 "name": "BaseBdev2", 00:26:15.114 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:15.114 "is_configured": true, 00:26:15.114 "data_offset": 256, 00:26:15.114 "data_size": 7936 00:26:15.114 } 00:26:15.114 ] 00:26:15.114 }' 00:26:15.114 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:15.114 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:15.114 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:15.114 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:15.114 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:15.372 [2024-05-15 00:08:15.833809] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:15.372 [2024-05-15 00:08:15.853012] 
bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:15.372 [2024-05-15 00:08:15.853062] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.372 00:08:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.666 00:08:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:15.666 "name": "raid_bdev1", 00:26:15.666 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:15.666 "strip_size_kb": 0, 00:26:15.666 "state": "online", 00:26:15.666 "raid_level": "raid1", 00:26:15.666 "superblock": true, 00:26:15.666 "num_base_bdevs": 2, 00:26:15.666 "num_base_bdevs_discovered": 1, 00:26:15.666 "num_base_bdevs_operational": 1, 00:26:15.666 "base_bdevs_list": [ 00:26:15.666 { 00:26:15.666 "name": null, 00:26:15.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.666 "is_configured": false, 00:26:15.666 "data_offset": 256, 00:26:15.666 "data_size": 7936 00:26:15.666 }, 00:26:15.666 { 00:26:15.666 "name": "BaseBdev2", 00:26:15.666 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:15.666 "is_configured": true, 00:26:15.666 "data_offset": 256, 00:26:15.666 "data_size": 7936 00:26:15.666 } 00:26:15.666 ] 00:26:15.666 }' 00:26:15.666 00:08:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:15.666 00:08:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:16.234 00:08:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:16.494 [2024-05-15 00:08:16.911672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:16.494 [2024-05-15 00:08:16.911722] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.494 [2024-05-15 00:08:16.911743] 
vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbdb410 00:26:16.494 [2024-05-15 00:08:16.911756] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.494 [2024-05-15 00:08:16.911939] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.494 [2024-05-15 00:08:16.911955] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:16.494 [2024-05-15 00:08:16.912008] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:26:16.494 [2024-05-15 00:08:16.912020] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:16.494 [2024-05-15 00:08:16.912031] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:16.494 [2024-05-15 00:08:16.912051] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:16.494 [2024-05-15 00:08:16.915490] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbdba00 00:26:16.494 [2024-05-15 00:08:16.916820] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:16.494 spare 00:26:16.494 00:08:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # sleep 1 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.430 00:08:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:17.688 "name": "raid_bdev1", 00:26:17.688 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:17.688 "strip_size_kb": 0, 00:26:17.688 "state": "online", 00:26:17.688 "raid_level": "raid1", 00:26:17.688 "superblock": true, 00:26:17.688 "num_base_bdevs": 2, 00:26:17.688 "num_base_bdevs_discovered": 2, 00:26:17.688 "num_base_bdevs_operational": 2, 00:26:17.688 "process": { 00:26:17.688 "type": "rebuild", 00:26:17.688 "target": "spare", 00:26:17.688 "progress": { 00:26:17.688 "blocks": 2816, 00:26:17.688 "percent": 35 00:26:17.688 } 00:26:17.688 }, 00:26:17.688 "base_bdevs_list": [ 00:26:17.688 { 00:26:17.688 "name": "spare", 00:26:17.688 "uuid": "ed36f516-420c-59b7-80c2-16aaca13c0c8", 00:26:17.688 "is_configured": true, 00:26:17.688 "data_offset": 256, 00:26:17.688 "data_size": 7936 00:26:17.688 }, 00:26:17.688 { 00:26:17.688 "name": "BaseBdev2", 00:26:17.688 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:17.688 "is_configured": true, 00:26:17.688 "data_offset": 256, 00:26:17.688 "data_size": 7936 00:26:17.688 } 
00:26:17.688 ] 00:26:17.688 }' 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:26:17.688 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:17.947 [2024-05-15 00:08:18.442632] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:17.947 [2024-05-15 00:08:18.529082] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:17.947 [2024-05-15 00:08:18.529128] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.206 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.465 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:18.465 "name": "raid_bdev1", 00:26:18.465 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:18.465 "strip_size_kb": 0, 00:26:18.465 "state": "online", 00:26:18.465 "raid_level": "raid1", 00:26:18.465 "superblock": true, 00:26:18.465 "num_base_bdevs": 2, 00:26:18.465 "num_base_bdevs_discovered": 1, 00:26:18.465 "num_base_bdevs_operational": 1, 00:26:18.465 "base_bdevs_list": [ 00:26:18.465 { 00:26:18.465 "name": null, 00:26:18.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.465 "is_configured": false, 00:26:18.465 "data_offset": 256, 00:26:18.465 "data_size": 7936 00:26:18.465 }, 00:26:18.465 { 00:26:18.465 "name": "BaseBdev2", 00:26:18.465 "uuid": 
"c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:18.465 "is_configured": true, 00:26:18.465 "data_offset": 256, 00:26:18.465 "data_size": 7936 00:26:18.465 } 00:26:18.465 ] 00:26:18.465 }' 00:26:18.465 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:18.465 00:08:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.033 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.291 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:19.291 "name": "raid_bdev1", 00:26:19.291 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:19.291 "strip_size_kb": 0, 00:26:19.291 "state": "online", 00:26:19.291 "raid_level": "raid1", 00:26:19.291 "superblock": true, 00:26:19.291 "num_base_bdevs": 2, 00:26:19.291 "num_base_bdevs_discovered": 1, 00:26:19.291 "num_base_bdevs_operational": 1, 00:26:19.291 "base_bdevs_list": [ 00:26:19.292 { 00:26:19.292 "name": null, 00:26:19.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.292 "is_configured": false, 00:26:19.292 "data_offset": 256, 00:26:19.292 "data_size": 7936 00:26:19.292 }, 00:26:19.292 { 00:26:19.292 "name": "BaseBdev2", 00:26:19.292 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:19.292 "is_configured": true, 00:26:19.292 "data_offset": 256, 00:26:19.292 "data_size": 7936 00:26:19.292 } 00:26:19.292 ] 00:26:19.292 }' 00:26:19.292 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:19.292 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:19.292 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:19.292 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:19.292 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:19.550 00:08:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:19.809 [2024-05-15 00:08:20.213156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:19.809 [2024-05-15 00:08:20.213202] vbdev_passthru.c: 
636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.809 [2024-05-15 00:08:20.213224] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbdd2f0 00:26:19.809 [2024-05-15 00:08:20.213237] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.809 [2024-05-15 00:08:20.213414] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.809 [2024-05-15 00:08:20.213430] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:19.809 [2024-05-15 00:08:20.213474] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:19.809 [2024-05-15 00:08:20.213486] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:19.809 [2024-05-15 00:08:20.213496] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:19.809 BaseBdev1 00:26:19.809 00:08:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@786 -- # sleep 1 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.746 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.005 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:21.005 "name": "raid_bdev1", 00:26:21.005 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:21.005 "strip_size_kb": 0, 00:26:21.005 "state": "online", 00:26:21.005 "raid_level": "raid1", 00:26:21.005 "superblock": true, 00:26:21.005 "num_base_bdevs": 2, 00:26:21.005 "num_base_bdevs_discovered": 1, 00:26:21.005 "num_base_bdevs_operational": 1, 00:26:21.005 "base_bdevs_list": [ 00:26:21.005 { 00:26:21.005 "name": null, 00:26:21.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.005 "is_configured": false, 00:26:21.005 "data_offset": 256, 00:26:21.005 "data_size": 7936 00:26:21.005 }, 00:26:21.005 { 00:26:21.005 "name": "BaseBdev2", 00:26:21.005 "uuid": 
"c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:21.005 "is_configured": true, 00:26:21.005 "data_offset": 256, 00:26:21.005 "data_size": 7936 00:26:21.005 } 00:26:21.005 ] 00:26:21.005 }' 00:26:21.005 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:21.005 00:08:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:21.572 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:21.572 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:21.573 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:21.573 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:21.573 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:21.573 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.573 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:21.832 "name": "raid_bdev1", 00:26:21.832 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:21.832 "strip_size_kb": 0, 00:26:21.832 "state": "online", 00:26:21.832 "raid_level": "raid1", 00:26:21.832 "superblock": true, 00:26:21.832 "num_base_bdevs": 2, 00:26:21.832 "num_base_bdevs_discovered": 1, 00:26:21.832 "num_base_bdevs_operational": 1, 00:26:21.832 "base_bdevs_list": [ 00:26:21.832 { 00:26:21.832 "name": null, 00:26:21.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.832 "is_configured": false, 00:26:21.832 "data_offset": 256, 00:26:21.832 "data_size": 7936 00:26:21.832 }, 00:26:21.832 { 00:26:21.832 "name": "BaseBdev2", 00:26:21.832 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:21.832 "is_configured": true, 00:26:21.832 "data_offset": 256, 00:26:21.832 "data_size": 7936 00:26:21.832 } 00:26:21.832 ] 00:26:21.832 }' 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:21.832 00:08:22 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:21.832 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:22.091 [2024-05-15 00:08:22.615568] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:22.091 [2024-05-15 00:08:22.615700] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:22.091 [2024-05-15 00:08:22.615716] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:22.091 request: 00:26:22.091 { 00:26:22.091 "raid_bdev": "raid_bdev1", 00:26:22.091 "base_bdev": "BaseBdev1", 00:26:22.091 "method": "bdev_raid_add_base_bdev", 00:26:22.091 "req_id": 1 00:26:22.091 } 00:26:22.091 Got JSON-RPC error response 00:26:22.091 response: 00:26:22.091 { 00:26:22.091 "code": -22, 00:26:22.091 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:22.091 } 00:26:22.091 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:26:22.091 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:22.091 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:22.091 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:22.091 00:08:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # sleep 1 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:26:23.464 "name": "raid_bdev1", 00:26:23.464 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:23.464 "strip_size_kb": 0, 00:26:23.464 "state": "online", 00:26:23.464 "raid_level": "raid1", 00:26:23.464 "superblock": true, 00:26:23.464 "num_base_bdevs": 2, 00:26:23.464 "num_base_bdevs_discovered": 1, 00:26:23.464 "num_base_bdevs_operational": 1, 00:26:23.464 "base_bdevs_list": [ 00:26:23.464 { 00:26:23.464 "name": null, 00:26:23.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.464 "is_configured": false, 00:26:23.464 "data_offset": 256, 00:26:23.464 "data_size": 7936 00:26:23.464 }, 00:26:23.464 { 00:26:23.464 "name": "BaseBdev2", 00:26:23.464 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:23.464 "is_configured": true, 00:26:23.464 "data_offset": 256, 00:26:23.464 "data_size": 7936 00:26:23.464 } 00:26:23.464 ] 00:26:23.464 }' 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:26:23.464 00:08:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.031 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:26:24.290 "name": "raid_bdev1", 00:26:24.290 "uuid": "c664daec-15ea-4fd3-8e22-ed097e9586ce", 00:26:24.290 "strip_size_kb": 0, 00:26:24.290 "state": "online", 00:26:24.290 "raid_level": "raid1", 00:26:24.290 "superblock": true, 
00:26:24.290 "num_base_bdevs": 2, 00:26:24.290 "num_base_bdevs_discovered": 1, 00:26:24.290 "num_base_bdevs_operational": 1, 00:26:24.290 "base_bdevs_list": [ 00:26:24.290 { 00:26:24.290 "name": null, 00:26:24.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.290 "is_configured": false, 00:26:24.290 "data_offset": 256, 00:26:24.290 "data_size": 7936 00:26:24.290 }, 00:26:24.290 { 00:26:24.290 "name": "BaseBdev2", 00:26:24.290 "uuid": "c6aa8e31-e4a6-5139-88c8-c1fc21202987", 00:26:24.290 "is_configured": true, 00:26:24.290 "data_offset": 256, 00:26:24.290 "data_size": 7936 00:26:24.290 } 00:26:24.290 ] 00:26:24.290 }' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # killprocess 525078 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 525078 ']' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 525078 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 525078 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 525078' 00:26:24.290 killing process with pid 525078 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 525078 00:26:24.290 Received shutdown signal, test time was about 60.000000 seconds 00:26:24.290 00:26:24.290 Latency(us) 00:26:24.290 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:24.290 =================================================================================================================== 00:26:24.290 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:24.290 [2024-05-15 00:08:24.822263] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:24.290 [2024-05-15 00:08:24.822359] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:24.290 [2024-05-15 00:08:24.822420] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:24.290 [2024-05-15 00:08:24.822433] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa387a0 name raid_bdev1, state offline 00:26:24.290 00:08:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 525078 00:26:24.290 [2024-05-15 00:08:24.853802] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:24.549 00:08:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@797 -- # return 0 00:26:24.549 00:26:24.549 real 0m29.375s 00:26:24.549 user 0m46.763s 00:26:24.549 sys 0m3.965s 00:26:24.549 00:08:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:24.549 00:08:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:24.549 ************************************ 00:26:24.549 END TEST raid_rebuild_test_sb_md_interleaved 00:26:24.549 ************************************ 00:26:24.807 00:08:25 bdev_raid -- bdev/bdev_raid.sh@862 -- # rm -f /raidrandtest 00:26:24.807 00:26:24.807 real 16m6.123s 00:26:24.807 user 27m25.313s 00:26:24.807 sys 2m55.419s 00:26:24.807 00:08:25 bdev_raid -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:24.807 00:08:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:24.807 ************************************ 00:26:24.807 END TEST bdev_raid 00:26:24.807 ************************************ 00:26:24.807 00:08:25 -- spdk/autotest.sh@187 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:24.807 00:08:25 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:26:24.807 00:08:25 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:24.807 00:08:25 -- common/autotest_common.sh@10 -- # set +x 00:26:24.807 ************************************ 00:26:24.807 START TEST bdevperf_config 00:26:24.807 ************************************ 00:26:24.807 00:08:25 bdevperf_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:24.807 * Looking for test storage... 
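The bdevperf_config suite that begins here drives the bdevperf example binary with a JSON bdev config (conf.json) plus an INI-style job file (test.conf) built by create_job, then validates each run by counting the jobs bdevperf reports. A condensed sketch of that final check, using only the paths, flags and grep patterns visible in the trace below; the variable names, the 2>&1 capture and the error message are illustrative, not the test's own code, which stores the output in bdevperf_output before grepping it via get_num_jobs in bdevperf/common.sh:

#!/usr/bin/env bash
# Run bdevperf for 2 seconds with the generated configs and capture its output.
bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json
jobs=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
out=$("$bdevperf" -t 2 --json "$conf" -j "$jobs" 2>&1)
# Extract the job count from the "Using job config with N jobs" banner.
num=$(echo "$out" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+')
[[ $num == 4 ]] || echo "unexpected job count: $num"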
00:26:24.807 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:24.807 00:08:25 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:24.808 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:24.808 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:24.808 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:26:24.808 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:24.808 00:08:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:25.067 00:08:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:25.067 00:08:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:25.067 00:08:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:25.067 00:26:25.067 00:08:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:25.067 00:08:25 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:27.602 00:08:28 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-05-15 00:08:25.457007] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:27.602 [2024-05-15 00:08:25.457061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid529398 ] 00:26:27.602 Using job config with 4 jobs 00:26:27.602 [2024-05-15 00:08:25.589836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:27.602 [2024-05-15 00:08:25.706979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:27.602 cpumask for '\''job0'\'' is too big 00:26:27.602 cpumask for '\''job1'\'' is too big 00:26:27.602 cpumask for '\''job2'\'' is too big 00:26:27.602 cpumask for '\''job3'\'' is too big 00:26:27.602 Running I/O for 2 seconds... 00:26:27.602 00:26:27.602 Latency(us) 00:26:27.602 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.602 Malloc0 : 2.02 23675.25 23.12 0.00 0.00 10802.38 1937.59 16640.45 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.602 Malloc0 : 2.02 23653.69 23.10 0.00 0.00 10786.91 1909.09 14702.86 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.602 Malloc0 : 2.03 23632.34 23.08 0.00 0.00 10772.56 1909.09 12765.27 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.602 Malloc0 : 2.03 23610.94 23.06 0.00 0.00 10757.17 1909.09 11169.61 00:26:27.602 =================================================================================================================== 00:26:27.602 Total : 94572.22 92.36 0.00 0.00 10779.76 1909.09 16640.45' 00:26:27.602 00:08:28 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-05-15 00:08:25.457007] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:26:27.602 [2024-05-15 00:08:25.457061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid529398 ] 00:26:27.602 Using job config with 4 jobs 00:26:27.602 [2024-05-15 00:08:25.589836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:27.602 [2024-05-15 00:08:25.706979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:27.602 cpumask for '\''job0'\'' is too big 00:26:27.602 cpumask for '\''job1'\'' is too big 00:26:27.602 cpumask for '\''job2'\'' is too big 00:26:27.602 cpumask for '\''job3'\'' is too big 00:26:27.602 Running I/O for 2 seconds... 00:26:27.602 00:26:27.602 Latency(us) 00:26:27.602 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.602 Malloc0 : 2.02 23675.25 23.12 0.00 0.00 10802.38 1937.59 16640.45 00:26:27.602 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.02 23653.69 23.10 0.00 0.00 10786.91 1909.09 14702.86 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.03 23632.34 23.08 0.00 0.00 10772.56 1909.09 12765.27 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.03 23610.94 23.06 0.00 0.00 10757.17 1909.09 11169.61 00:26:27.603 =================================================================================================================== 00:26:27.603 Total : 94572.22 92.36 0.00 0.00 10779.76 1909.09 16640.45' 00:26:27.603 00:08:28 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 00:08:25.457007] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:27.603 [2024-05-15 00:08:25.457061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid529398 ] 00:26:27.603 Using job config with 4 jobs 00:26:27.603 [2024-05-15 00:08:25.589836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:27.603 [2024-05-15 00:08:25.706979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:27.603 cpumask for '\''job0'\'' is too big 00:26:27.603 cpumask for '\''job1'\'' is too big 00:26:27.603 cpumask for '\''job2'\'' is too big 00:26:27.603 cpumask for '\''job3'\'' is too big 00:26:27.603 Running I/O for 2 seconds... 
00:26:27.603 00:26:27.603 Latency(us) 00:26:27.603 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.02 23675.25 23.12 0.00 0.00 10802.38 1937.59 16640.45 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.02 23653.69 23.10 0.00 0.00 10786.91 1909.09 14702.86 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.03 23632.34 23.08 0.00 0.00 10772.56 1909.09 12765.27 00:26:27.603 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:27.603 Malloc0 : 2.03 23610.94 23.06 0.00 0.00 10757.17 1909.09 11169.61 00:26:27.603 =================================================================================================================== 00:26:27.603 Total : 94572.22 92.36 0.00 0.00 10779.76 1909.09 16640.45' 00:26:27.603 00:08:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:27.603 00:08:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:27.603 00:08:28 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:26:27.603 00:08:28 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:27.862 [2024-05-15 00:08:28.235069] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:27.862 [2024-05-15 00:08:28.235133] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid529748 ] 00:26:27.862 [2024-05-15 00:08:28.375227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:28.121 [2024-05-15 00:08:28.493066] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:28.121 cpumask for 'job0' is too big 00:26:28.121 cpumask for 'job1' is too big 00:26:28.121 cpumask for 'job2' is too big 00:26:28.121 cpumask for 'job3' is too big 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:26:30.653 Running I/O for 2 seconds... 
00:26:30.653 00:26:30.653 Latency(us) 00:26:30.653 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:30.653 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:30.653 Malloc0 : 2.02 23592.05 23.04 0.00 0.00 10837.07 1909.09 16640.45 00:26:30.653 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:30.653 Malloc0 : 2.02 23570.54 23.02 0.00 0.00 10820.20 1894.85 14702.86 00:26:30.653 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:30.653 Malloc0 : 2.02 23549.14 23.00 0.00 0.00 10805.82 1894.85 12822.26 00:26:30.653 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:30.653 Malloc0 : 2.02 23527.81 22.98 0.00 0.00 10791.53 1894.85 11226.60 00:26:30.653 =================================================================================================================== 00:26:30.653 Total : 94239.53 92.03 0.00 0.00 10813.66 1894.85 16640.45' 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:30.653 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:30.653 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:30.653 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:30.653 00:08:30 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:33.186 00:08:33 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-05-15 00:08:30.995877] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:33.186 [2024-05-15 00:08:30.995942] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530103 ] 00:26:33.186 Using job config with 3 jobs 00:26:33.186 [2024-05-15 00:08:31.130575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:33.186 [2024-05-15 00:08:31.242981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:33.186 cpumask for '\''job0'\'' is too big 00:26:33.186 cpumask for '\''job1'\'' is too big 00:26:33.186 cpumask for '\''job2'\'' is too big 00:26:33.186 Running I/O for 2 seconds... 00:26:33.186 00:26:33.186 Latency(us) 00:26:33.186 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:33.186 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.186 Malloc0 : 2.01 31591.35 30.85 0.00 0.00 8093.51 1880.60 11910.46 00:26:33.186 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.186 Malloc0 : 2.02 31604.41 30.86 0.00 0.00 8071.62 1852.10 10029.86 00:26:33.186 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.186 Malloc0 : 2.02 31575.68 30.84 0.00 0.00 8061.57 1880.60 8434.20 00:26:33.186 =================================================================================================================== 00:26:33.186 Total : 94771.43 92.55 0.00 0.00 8075.55 1852.10 11910.46' 00:26:33.186 00:08:33 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-05-15 00:08:30.995877] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:33.186 [2024-05-15 00:08:30.995942] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530103 ] 00:26:33.186 Using job config with 3 jobs 00:26:33.187 [2024-05-15 00:08:31.130575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:33.187 [2024-05-15 00:08:31.242981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:33.187 cpumask for '\''job0'\'' is too big 00:26:33.187 cpumask for '\''job1'\'' is too big 00:26:33.187 cpumask for '\''job2'\'' is too big 00:26:33.187 Running I/O for 2 seconds... 
00:26:33.187 00:26:33.187 Latency(us) 00:26:33.187 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.01 31591.35 30.85 0.00 0.00 8093.51 1880.60 11910.46 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.02 31604.41 30.86 0.00 0.00 8071.62 1852.10 10029.86 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.02 31575.68 30.84 0.00 0.00 8061.57 1880.60 8434.20 00:26:33.187 =================================================================================================================== 00:26:33.187 Total : 94771.43 92.55 0.00 0.00 8075.55 1852.10 11910.46' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 00:08:30.995877] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:33.187 [2024-05-15 00:08:30.995942] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530103 ] 00:26:33.187 Using job config with 3 jobs 00:26:33.187 [2024-05-15 00:08:31.130575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:33.187 [2024-05-15 00:08:31.242981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:33.187 cpumask for '\''job0'\'' is too big 00:26:33.187 cpumask for '\''job1'\'' is too big 00:26:33.187 cpumask for '\''job2'\'' is too big 00:26:33.187 Running I/O for 2 seconds... 00:26:33.187 00:26:33.187 Latency(us) 00:26:33.187 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.01 31591.35 30.85 0.00 0.00 8093.51 1880.60 11910.46 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.02 31604.41 30.86 0.00 0.00 8071.62 1852.10 10029.86 00:26:33.187 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:33.187 Malloc0 : 2.02 31575.68 30.84 0.00 0.00 8061.57 1880.60 8434.20 00:26:33.187 =================================================================================================================== 00:26:33.187 Total : 94771.43 92.55 0.00 0.00 8075.55 1852.10 11910.46' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:26:33.187 00:08:33 
bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:33.187 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:33.187 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:33.187 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:33.187 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:33.187 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:33.187 00:08:33 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:36.476 00:08:36 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-05-15 00:08:33.802491] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:26:36.476 [2024-05-15 00:08:33.802555] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530463 ] 00:26:36.476 Using job config with 4 jobs 00:26:36.476 [2024-05-15 00:08:33.945964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.476 [2024-05-15 00:08:34.067811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.476 cpumask for '\''job0'\'' is too big 00:26:36.476 cpumask for '\''job1'\'' is too big 00:26:36.476 cpumask for '\''job2'\'' is too big 00:26:36.476 cpumask for '\''job3'\'' is too big 00:26:36.476 Running I/O for 2 seconds... 00:26:36.476 00:26:36.476 Latency(us) 00:26:36.476 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.04 11788.08 11.51 0.00 0.00 21696.44 3932.16 33736.79 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.04 11777.17 11.50 0.00 0.00 21695.75 4729.99 33736.79 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11766.58 11.49 0.00 0.00 21635.79 3903.67 29633.67 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11755.72 11.48 0.00 0.00 21635.93 4701.50 29633.67 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11745.19 11.47 0.00 0.00 21576.51 3875.17 25758.50 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11734.43 11.46 0.00 0.00 21575.01 4701.50 25758.50 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11723.94 11.45 0.00 0.00 21516.46 3875.17 22225.25 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11713.19 11.44 0.00 0.00 21516.11 4701.50 22111.28 00:26:36.476 =================================================================================================================== 00:26:36.476 Total : 94004.29 91.80 0.00 0.00 21606.00 3875.17 33736.79' 00:26:36.476 00:08:36 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-05-15 00:08:33.802491] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:36.476 [2024-05-15 00:08:33.802555] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530463 ] 00:26:36.476 Using job config with 4 jobs 00:26:36.476 [2024-05-15 00:08:33.945964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.476 [2024-05-15 00:08:34.067811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.476 cpumask for '\''job0'\'' is too big 00:26:36.476 cpumask for '\''job1'\'' is too big 00:26:36.476 cpumask for '\''job2'\'' is too big 00:26:36.476 cpumask for '\''job3'\'' is too big 00:26:36.476 Running I/O for 2 seconds... 
00:26:36.476 00:26:36.476 Latency(us) 00:26:36.476 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.04 11788.08 11.51 0.00 0.00 21696.44 3932.16 33736.79 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.04 11777.17 11.50 0.00 0.00 21695.75 4729.99 33736.79 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11766.58 11.49 0.00 0.00 21635.79 3903.67 29633.67 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11755.72 11.48 0.00 0.00 21635.93 4701.50 29633.67 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11745.19 11.47 0.00 0.00 21576.51 3875.17 25758.50 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11734.43 11.46 0.00 0.00 21575.01 4701.50 25758.50 00:26:36.476 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc0 : 2.05 11723.94 11.45 0.00 0.00 21516.46 3875.17 22225.25 00:26:36.476 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.476 Malloc1 : 2.05 11713.19 11.44 0.00 0.00 21516.11 4701.50 22111.28 00:26:36.477 =================================================================================================================== 00:26:36.477 Total : 94004.29 91.80 0.00 0.00 21606.00 3875.17 33736.79' 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 00:08:33.802491] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:36.477 [2024-05-15 00:08:33.802555] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530463 ] 00:26:36.477 Using job config with 4 jobs 00:26:36.477 [2024-05-15 00:08:33.945964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.477 [2024-05-15 00:08:34.067811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.477 cpumask for '\''job0'\'' is too big 00:26:36.477 cpumask for '\''job1'\'' is too big 00:26:36.477 cpumask for '\''job2'\'' is too big 00:26:36.477 cpumask for '\''job3'\'' is too big 00:26:36.477 Running I/O for 2 seconds... 
00:26:36.477 00:26:36.477 Latency(us) 00:26:36.477 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.477 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc0 : 2.04 11788.08 11.51 0.00 0.00 21696.44 3932.16 33736.79 00:26:36.477 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc1 : 2.04 11777.17 11.50 0.00 0.00 21695.75 4729.99 33736.79 00:26:36.477 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc0 : 2.05 11766.58 11.49 0.00 0.00 21635.79 3903.67 29633.67 00:26:36.477 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc1 : 2.05 11755.72 11.48 0.00 0.00 21635.93 4701.50 29633.67 00:26:36.477 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc0 : 2.05 11745.19 11.47 0.00 0.00 21576.51 3875.17 25758.50 00:26:36.477 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc1 : 2.05 11734.43 11.46 0.00 0.00 21575.01 4701.50 25758.50 00:26:36.477 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc0 : 2.05 11723.94 11.45 0.00 0.00 21516.46 3875.17 22225.25 00:26:36.477 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:36.477 Malloc1 : 2.05 11713.19 11.44 0.00 0.00 21516.11 4701.50 22111.28 00:26:36.477 =================================================================================================================== 00:26:36.477 Total : 94004.29 91.80 0.00 0.00 21606.00 3875.17 33736.79' 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:36.477 00:08:36 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:26:36.477 00:26:36.477 real 0m11.334s 00:26:36.477 user 0m10.008s 00:26:36.477 sys 0m1.179s 00:26:36.477 00:08:36 bdevperf_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:36.477 00:08:36 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:26:36.477 ************************************ 00:26:36.477 END TEST bdevperf_config 00:26:36.477 ************************************ 00:26:36.477 00:08:36 -- spdk/autotest.sh@188 -- # uname -s 00:26:36.477 00:08:36 -- spdk/autotest.sh@188 -- # [[ Linux == Linux ]] 00:26:36.477 00:08:36 -- spdk/autotest.sh@189 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:36.477 00:08:36 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:26:36.477 00:08:36 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:36.477 00:08:36 -- common/autotest_common.sh@10 -- # set +x 00:26:36.477 ************************************ 00:26:36.477 START TEST reactor_set_interrupt 00:26:36.477 ************************************ 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:36.477 * Looking for test storage... 00:26:36.477 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:36.477 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:36.477 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@14 -- # 
CONFIG_TSAN=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 
00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:36.477 00:08:36 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:36.478 00:08:36 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:36.478 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:36.478 00:08:36 reactor_set_interrupt -- 
common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:36.478 #define SPDK_CONFIG_H 00:26:36.478 #define SPDK_CONFIG_APPS 1 00:26:36.478 #define SPDK_CONFIG_ARCH native 00:26:36.478 #undef SPDK_CONFIG_ASAN 00:26:36.478 #undef SPDK_CONFIG_AVAHI 00:26:36.478 #undef SPDK_CONFIG_CET 00:26:36.478 #define SPDK_CONFIG_COVERAGE 1 00:26:36.478 #define SPDK_CONFIG_CROSS_PREFIX 00:26:36.478 #define SPDK_CONFIG_CRYPTO 1 00:26:36.478 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:36.478 #undef SPDK_CONFIG_CUSTOMOCF 00:26:36.478 #undef SPDK_CONFIG_DAOS 00:26:36.478 #define SPDK_CONFIG_DAOS_DIR 00:26:36.478 #define SPDK_CONFIG_DEBUG 1 00:26:36.478 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:36.478 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:36.478 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:36.478 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:36.478 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:36.478 #undef SPDK_CONFIG_DPDK_UADK 00:26:36.478 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:36.478 #define SPDK_CONFIG_EXAMPLES 1 00:26:36.478 #undef SPDK_CONFIG_FC 00:26:36.478 #define SPDK_CONFIG_FC_PATH 00:26:36.478 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:36.478 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:36.478 #undef SPDK_CONFIG_FUSE 00:26:36.478 #undef SPDK_CONFIG_FUZZER 00:26:36.478 #define SPDK_CONFIG_FUZZER_LIB 00:26:36.478 #undef SPDK_CONFIG_GOLANG 00:26:36.478 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:36.478 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:36.478 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:36.478 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:26:36.478 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:36.478 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:36.478 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:36.478 #define SPDK_CONFIG_IDXD 1 00:26:36.478 #undef SPDK_CONFIG_IDXD_KERNEL 00:26:36.478 #define SPDK_CONFIG_IPSEC_MB 1 00:26:36.478 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:36.478 #define SPDK_CONFIG_ISAL 1 00:26:36.478 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:36.478 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:36.478 #define SPDK_CONFIG_LIBDIR 00:26:36.478 #undef SPDK_CONFIG_LTO 00:26:36.478 #define SPDK_CONFIG_MAX_LCORES 00:26:36.478 #define SPDK_CONFIG_NVME_CUSE 1 00:26:36.478 #undef SPDK_CONFIG_OCF 00:26:36.478 #define SPDK_CONFIG_OCF_PATH 00:26:36.478 #define SPDK_CONFIG_OPENSSL_PATH 00:26:36.478 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:36.478 #define SPDK_CONFIG_PGO_DIR 00:26:36.478 #undef SPDK_CONFIG_PGO_USE 00:26:36.478 #define SPDK_CONFIG_PREFIX /usr/local 00:26:36.478 #undef SPDK_CONFIG_RAID5F 00:26:36.478 #undef SPDK_CONFIG_RBD 00:26:36.478 #define SPDK_CONFIG_RDMA 1 00:26:36.478 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:36.478 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:36.478 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:36.478 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:36.478 #define SPDK_CONFIG_SHARED 1 00:26:36.478 #undef SPDK_CONFIG_SMA 00:26:36.478 #define SPDK_CONFIG_TESTS 1 00:26:36.478 #undef SPDK_CONFIG_TSAN 00:26:36.478 #define SPDK_CONFIG_UBLK 1 00:26:36.478 #define SPDK_CONFIG_UBSAN 1 00:26:36.478 #undef SPDK_CONFIG_UNIT_TESTS 00:26:36.478 #undef SPDK_CONFIG_URING 00:26:36.478 #define SPDK_CONFIG_URING_PATH 00:26:36.478 #undef SPDK_CONFIG_URING_ZNS 00:26:36.478 #undef SPDK_CONFIG_USDT 00:26:36.478 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:36.478 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:36.478 #undef SPDK_CONFIG_VFIO_USER 00:26:36.478 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:36.478 #define SPDK_CONFIG_VHOST 1 00:26:36.478 #define SPDK_CONFIG_VIRTIO 1 00:26:36.478 #undef SPDK_CONFIG_VTUNE 00:26:36.478 #define SPDK_CONFIG_VTUNE_DIR 00:26:36.478 #define SPDK_CONFIG_WERROR 1 00:26:36.478 #define SPDK_CONFIG_WPDK_DIR 00:26:36.478 #undef SPDK_CONFIG_XNVME 00:26:36.478 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:36.478 00:08:36 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:36.478 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:36.478 00:08:36 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:36.478 00:08:36 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:36.478 00:08:36 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:36.478 00:08:36 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:36.478 00:08:36 reactor_set_interrupt -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:36.478 00:08:36 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:36.478 00:08:36 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:26:36.478 00:08:36 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:36.478 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:36.478 00:08:36 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:36.478 00:08:36 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:36.478 00:08:36 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:36.478 00:08:36 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:26:36.479 00:08:36 reactor_set_interrupt -- 
pm/common@76 -- # SUDO[1]='sudo -E' 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:36.479 00:08:36 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@57 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@61 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@63 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@65 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@67 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@69 -- # : 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@71 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@73 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@75 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@77 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@79 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@81 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@83 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@85 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@86 -- # export 
SPDK_TEST_NVME_CLI 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@87 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@89 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@91 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@93 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@95 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@97 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@99 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@101 -- # : rdma 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@103 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@105 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@107 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@109 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@111 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@113 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@115 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@117 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@119 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@121 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:26:36.479 00:08:36 reactor_set_interrupt -- 
common/autotest_common.sh@123 -- # : 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@125 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@127 -- # : 1 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@129 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@131 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@133 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@135 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@137 -- # : 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@139 -- # : true 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@141 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@143 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@145 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@147 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@149 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@151 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@153 -- # : 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@155 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@157 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@159 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- 
common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@161 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@163 -- # : 0 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:26:36.479 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@168 -- # : 0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@170 -- # : 0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export 
PCI_BLOCK_SYNC_ON_RESET=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@199 -- # cat 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 
00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@262 -- # export valgrind= 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@262 -- # valgrind= 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@268 -- # uname -s 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]] 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@278 -- # MAKE=make 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@298 -- # TEST_MODE= 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@317 -- # [[ -z 530857 ]] 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@317 -- # kill -0 530857 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@327 -- # [[ -v testdir ]] 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@329 -- # local requested_size=2147483648 00:26:36.480 00:08:36 reactor_set_interrupt -- 
common/autotest_common.sh@330 -- # local mount target_dir 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.EGegiu 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.EGegiu/tests/interrupt /tmp/spdk.EGegiu 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@326 -- # df -T 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:26:36.480 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=969789440 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4314640384 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=88767176704 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508531712 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=5741355008 00:26:36.481 00:08:36 reactor_set_interrupt -- 
common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=47249555456 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=18892214272 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901708800 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=9494528 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=47253536768 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=729088 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:26:36.481 * Looking for test storage... 
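The xtrace above is SPDK's set_test_storage helper walking the df -T output and recording the size, avail and use of every mount before it settles on a test directory with enough free space. A minimal stand-alone sketch of that selection step, not the autotest_common.sh implementation itself; the pick_test_storage name and the awk-based free-space read are illustrative:

    # Pick the first candidate directory whose backing filesystem has at least
    # $requested_size bytes available; otherwise fall back to a temp dir, much
    # like the storage_fallback seen in the trace.
    pick_test_storage() {
        local requested_size=$1; shift     # bytes; 2147483648 (2 GiB) in the trace
        local dir avail
        for dir in "$@"; do
            # df prints a header plus one data line; NR==2 keeps the data line
            # and awk strips the leading whitespace for the integer compare.
            avail=$(df -B1 --output=avail "$dir" 2>/dev/null | awk 'NR==2 {print $1}')
            if [ -n "$avail" ] && [ "$avail" -ge "$requested_size" ]; then
                echo "$dir"
                return 0
            fi
        done
        mktemp -d -t spdk.XXXXXX
    }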
00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@367 -- # local target_space new_size 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@371 -- # mount=/ 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@373 -- # target_space=88767176704 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@380 -- # new_size=7955947520 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.481 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@388 -- # return 0 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1678 -- # set -o errtrace 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # true 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # xtrace_fd 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=530898 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:36.481 00:08:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 530898 /var/tmp/spdk.sock 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 530898 ']' 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:36.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
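waitforlisten above holds the test until the freshly launched interrupt_tgt process answers on its RPC UNIX socket. A rough equivalent is sketched below; it assumes scripts/rpc.py and its spdk_get_version method can serve as the liveness probe (an assumption, the real helper in autotest_common.sh carries more retries and error handling), and rpc is a variable pointing at that script:

    # Wait until $pid is still alive and the RPC server answers on $sock.
    wait_for_rpc() {
        local pid=$1 sock=$2 rpc=$3 i
        for ((i = 0; i < 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1               # target died early
            if [ -S "$sock" ] && "$rpc" -s "$sock" spdk_get_version &>/dev/null; then
                return 0                                          # RPC is reachable
            fi
            sleep 0.1
        done
        return 1
    }

    # e.g. wait_for_rpc "$intr_tgt_pid" /var/tmp/spdk.sock "$rpc_py"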
00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:36.481 00:08:36 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:36.481 [2024-05-15 00:08:36.979161] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:36.481 [2024-05-15 00:08:36.979226] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid530898 ] 00:26:36.742 [2024-05-15 00:08:37.108290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:36.742 [2024-05-15 00:08:37.212562] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:36.742 [2024-05-15 00:08:37.212657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:36.742 [2024-05-15 00:08:37.212662] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.742 [2024-05-15 00:08:37.283748] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:37.373 00:08:37 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:37.373 00:08:37 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:26:37.373 00:08:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:26:37.373 00:08:37 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:37.632 Malloc0 00:26:37.632 Malloc1 00:26:37.632 Malloc2 00:26:37.632 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:26:37.632 00:08:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:37.632 00:08:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:37.632 00:08:38 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:37.632 5000+0 records in 00:26:37.632 5000+0 records out 00:26:37.632 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0280594 s, 365 MB/s 00:26:37.632 00:08:38 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:37.890 AIO0 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 530898 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 530898 without_thd 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=530898 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:37.890 00:08:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:37.891 00:08:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:37.891 00:08:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # 
reactor_cpumask=1 00:26:37.891 00:08:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:37.891 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:37.891 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:38.149 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:38.407 00:08:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:38.407 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:38.408 spdk_thread ids are 1 on reactor0. 
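The jq pipeline above is how reactor_get_thread_ids maps a reactor cpumask to the spdk_thread ids pinned on it: thread_get_stats is queried over RPC and filtered on the cpumask field, which is why reactor 0 yields id 1 while reactor 2 yields nothing at this point. Condensed into one helper, with rpc_py assumed to point at scripts/rpc.py as it does in the trace:

    # Print the ids of spdk_threads whose cpumask matches the given reactor mask.
    reactor_thread_ids() {
        local mask=${1#0x}    # the trace strips the 0x prefix before comparing
        "$rpc_py" thread_get_stats \
            | jq --arg reactor_cpumask "$mask" \
                 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
    }

    # e.g. thd0_ids=($(reactor_thread_ids 0x1))   # gives "1" for reactor 0 in this run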
00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 530898 0 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 530898 0 idle 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:38.408 00:08:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530898 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.38 reactor_0' 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530898 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.38 reactor_0 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 530898 1 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 530898 1 idle 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:26:38.666 
00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530901 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1' 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530901 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:38.666 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:38.925 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:38.925 00:08:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 530898 2 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 530898 2 idle 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530902 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2' 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530902 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:26:38.926 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:26:38.926 00:08:39 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:26:39.184 [2024-05-15 00:08:39.673519] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:39.184 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:39.442 [2024-05-15 00:08:39.925248] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:39.443 [2024-05-15 00:08:39.925633] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:39.443 00:08:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:39.700 [2024-05-15 00:08:40.185187] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:39.700 [2024-05-15 00:08:40.185325] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 530898 0 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 530898 0 busy 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:39.700 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:39.701 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530898 root 20 0 128.2g 35712 23040 R 93.8 0.0 0:00.83 reactor_0' 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530898 root 20 0 128.2g 35712 23040 R 93.8 0.0 0:00.83 reactor_0 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:39.959 00:08:40 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 530898 2 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 530898 2 busy 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:39.959 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530902 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.36 reactor_2' 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530902 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.36 reactor_2 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:40.217 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:40.217 [2024-05-15 00:08:40.789165] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
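Every reactor_is_busy / reactor_is_idle probe above reduces to reading one thread's %CPU from a single batch-mode top sample: grep the reactor_N thread, take field 9, drop the decimals, and compare against a threshold. A compact version of that parsing follows; the 70 % busy floor and 30 % idle ceiling are the same cut-offs the trace compares against, while the function names are illustrative:

    # %CPU of the reactor_<idx> thread inside process $pid, from one top sample.
    reactor_cpu_rate() {
        local pid=$1 idx=$2 rate
        rate=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}" | awk '{print $9}')
        echo "${rate%%.*}"    # 93.8 -> 93, 0.0 -> 0, as in the trace
    }

    reactor_is_busy() { local r; r=$(reactor_cpu_rate "$1" "$2"); [ "${r:-0}" -ge 70 ]; }    # poll mode spins
    reactor_is_idle() { local r; r=$(reactor_cpu_rate "$1" "$2"); [ "${r:-100}" -le 30 ]; }  # interrupt mode sleeps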
00:26:40.217 [2024-05-15 00:08:40.789264] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 530898 2 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 530898 2 idle 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530902 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.60 reactor_2' 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530902 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.60 reactor_2 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:40.476 00:08:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:40.734 [2024-05-15 00:08:41.213163] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:40.734 [2024-05-15 00:08:41.213297] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:40.734 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:26:40.734 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:26:40.734 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:26:40.992 [2024-05-15 00:08:41.457308] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
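The mode switches themselves are single RPC calls against the interrupt_tgt example app, exactly as traced: reactor_set_interrupt_mode <reactor> -d drops a reactor to poll mode, and the same call without -d returns it to interrupt mode (the interrupt_plugin module is resolved through the PYTHONPATH export earlier in the trace). A small wrapper, again assuming rpc_py points at scripts/rpc.py:

    # Flip reactor $idx between interrupt mode and poll mode via the
    # interrupt_tgt RPC plugin seen in the trace.
    set_reactor_intr() {
        local idx=$1 mode=$2     # mode: "intr" or "poll"
        if [ "$mode" = poll ]; then
            "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$idx" -d
        else
            "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$idx"
        fi
    }

    # e.g. set_reactor_intr 2 poll && reactor_is_busy "$intr_tgt_pid" 2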
00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 530898 0 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 530898 0 idle 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=530898 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 530898 -w 256 00:26:40.992 00:08:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 530898 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.67 reactor_0' 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 530898 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.67 reactor_0 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:41.251 00:08:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:41.252 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:41.252 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:26:41.252 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:26:41.252 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 530898 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 530898 ']' 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 530898 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 530898 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 530898' 00:26:41.252 killing process with pid 530898 00:26:41.252 00:08:41 reactor_set_interrupt -- 
common/autotest_common.sh@965 -- # kill 530898 00:26:41.252 00:08:41 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 530898 00:26:41.521 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:26:41.521 00:08:41 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:41.521 00:08:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:26:41.521 00:08:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:41.521 00:08:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:41.521 00:08:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=531680 00:26:41.521 00:08:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:41.521 00:08:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:41.521 00:08:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 531680 /var/tmp/spdk.sock 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 531680 ']' 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:41.522 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:41.522 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:41.522 [2024-05-15 00:08:42.041089] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:41.522 [2024-05-15 00:08:42.041160] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid531680 ] 00:26:41.785 [2024-05-15 00:08:42.170059] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:41.785 [2024-05-15 00:08:42.277343] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:41.785 [2024-05-15 00:08:42.277484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:41.785 [2024-05-15 00:08:42.277490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:41.785 [2024-05-15 00:08:42.358765] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
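Teardown at the top of this block is the stock killprocess pattern: confirm the pid is still alive with kill -0, inspect ps -o comm= (here it resolves to reactor_0, so no sudo special-casing is needed), send SIGTERM, and wait so the exit status is reaped before cleanup removes the AIO backing file. A stripped-down sketch of just that happy path:

    killprocess_simple() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 0            # nothing left to do
        echo "killing process with pid $pid"
        kill "$pid"                                        # plain SIGTERM, as in the trace
        wait "$pid" 2>/dev/null || true                    # reap it so cleanup can proceed
    }

    # followed by the cleanup step from interrupt/common.sh:
    #   rm -f "$testdir/aiofile"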
00:26:42.724 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:42.724 00:08:42 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:26:42.724 00:08:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:26:42.724 00:08:42 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:42.724 Malloc0 00:26:42.724 Malloc1 00:26:42.724 Malloc2 00:26:42.724 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:26:42.724 00:08:43 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:42.724 00:08:43 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:42.724 00:08:43 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:42.983 5000+0 records in 00:26:42.983 5000+0 records out 00:26:42.983 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0249768 s, 410 MB/s 00:26:42.983 00:08:43 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:42.983 AIO0 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 531680 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 531680 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=531680 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:43.243 00:08:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:43.520 00:08:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:43.520 spdk_thread ids are 1 on reactor0. 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 531680 0 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 531680 0 idle 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:43.520 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531680 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531680 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 531680 1 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 531680 1 idle 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:43.779 00:08:44 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:43.779 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531684 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531684 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 531680 2 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 531680 2 idle 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531685 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531685 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:26:44.039 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:44.297 [2024-05-15 00:08:44.790115] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:44.297 [2024-05-15 00:08:44.790308] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:26:44.297 [2024-05-15 00:08:44.790501] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:44.297 00:08:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:44.556 [2024-05-15 00:08:45.042717] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:44.556 [2024-05-15 00:08:45.042919] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 531680 0 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 531680 0 busy 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:44.556 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531680 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0' 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531680 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:44.814 00:08:45 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 531680 2 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 531680 2 busy 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:44.814 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531685 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531685 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:45.073 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:45.073 [2024-05-15 00:08:45.648437] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:26:45.073 [2024-05-15 00:08:45.648535] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 531680 2 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 531680 2 idle 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531685 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531685 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:45.331 00:08:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:45.589 [2024-05-15 00:08:46.073520] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:45.589 [2024-05-15 00:08:46.073696] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
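Taken together, both passes of the test follow the same shape: check every reactor is idle while in interrupt mode, switch reactors 0 and 2 to poll mode and expect them busy, then switch back and expect idle again; the second, with-threads pass additionally shows app_thread flipping to poll mode and back in the thread.c notices above. A condensed driver built from the helpers sketched earlier (illustrative only; the real reactor_set_intr_mode in reactor_set_interrupt.sh also handles the without_thd cpumask moves and retry loops):

    verify_intr_toggle() {
        local pid=$1 i
        for i in 0 1 2; do reactor_is_idle "$pid" "$i" || return 1; done   # all idle in intr mode
        for i in 0 2;   do set_reactor_intr "$i" poll; done                # drop 0 and 2 to poll mode
        for i in 0 2;   do reactor_is_busy "$pid" "$i" || return 1; done   # poll-mode reactors spin
        for i in 2 0;   do set_reactor_intr "$i" intr; done                # restore interrupt mode
        for i in 2 0;   do reactor_is_idle "$pid" "$i" || return 1; done   # and they go quiet again
    }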
00:26:45.589 [2024-05-15 00:08:46.073725] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 531680 0 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 531680 0 idle 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=531680 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:45.589 00:08:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 531680 -w 256 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 531680 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.69 reactor_0' 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 531680 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.69 reactor_0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:26:45.848 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 531680 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 531680 ']' 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 531680 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 531680 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 
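Teardown at the end of this test goes through the usual killprocess helper, whose traced steps are condensed below. This is a simplified reading of the trace, not the full common/autotest_common.sh implementation (which, for example, treats a sudo wrapper around the target process specially).

    killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1
        kill -0 "$pid" 2>/dev/null || return 0       # already exited, nothing to do
        if [[ $(uname) == Linux ]]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")  # reactor_0 in this run
            [[ $name != sudo ]] || return 1          # the real helper special-cases sudo here
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true              # reap it so the exit status is collected
    }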
00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 531680' 00:26:45.848 killing process with pid 531680 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@965 -- # kill 531680 00:26:45.848 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 531680 00:26:46.107 00:08:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:26:46.107 00:08:46 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:46.107 00:26:46.107 real 0m9.928s 00:26:46.107 user 0m9.433s 00:26:46.107 sys 0m2.073s 00:26:46.107 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:46.107 00:08:46 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:46.107 ************************************ 00:26:46.107 END TEST reactor_set_interrupt 00:26:46.107 ************************************ 00:26:46.107 00:08:46 -- spdk/autotest.sh@190 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:46.107 00:08:46 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:26:46.107 00:08:46 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:46.107 00:08:46 -- common/autotest_common.sh@10 -- # set +x 00:26:46.369 ************************************ 00:26:46.369 START TEST reap_unregistered_poller 00:26:46.369 ************************************ 00:26:46.369 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:46.369 * Looking for test storage... 00:26:46.369 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
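Each interrupt test starts the same way: interrupt_common.sh resolves the test directory and the repository root from the script location, then sources autotest_common.sh, which produces the long flag/export trace that follows. A minimal equivalent, with the path written out instead of $BASH_SOURCE:

    script=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh
    testdir=$(readlink -f "$(dirname "$script")")      # .../spdk/test/interrupt
    rootdir=$(readlink -f "$testdir/../..")            # .../spdk
    source "$rootdir/test/common/autotest_common.sh"   # pulls in the defaults traced below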
00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:46.369 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:46.369 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:46.369 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 
00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 
00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:46.370 00:08:46 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:46.370 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:46.370 00:08:46 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:46.370 00:08:46 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:46.370 00:08:46 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:46.370 00:08:46 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:46.370 00:08:46 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@11 -- # 
_test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:46.371 #define SPDK_CONFIG_H 00:26:46.371 #define SPDK_CONFIG_APPS 1 00:26:46.371 #define SPDK_CONFIG_ARCH native 00:26:46.371 #undef SPDK_CONFIG_ASAN 00:26:46.371 #undef SPDK_CONFIG_AVAHI 00:26:46.371 #undef SPDK_CONFIG_CET 00:26:46.371 #define SPDK_CONFIG_COVERAGE 1 00:26:46.371 #define SPDK_CONFIG_CROSS_PREFIX 00:26:46.371 #define SPDK_CONFIG_CRYPTO 1 00:26:46.371 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:46.371 #undef SPDK_CONFIG_CUSTOMOCF 00:26:46.371 #undef SPDK_CONFIG_DAOS 00:26:46.371 #define SPDK_CONFIG_DAOS_DIR 00:26:46.371 #define SPDK_CONFIG_DEBUG 1 00:26:46.371 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:46.371 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:46.371 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:46.371 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:46.371 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:46.371 #undef SPDK_CONFIG_DPDK_UADK 00:26:46.371 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:46.371 #define SPDK_CONFIG_EXAMPLES 1 00:26:46.371 #undef SPDK_CONFIG_FC 00:26:46.371 #define SPDK_CONFIG_FC_PATH 00:26:46.371 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:46.371 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:46.371 #undef SPDK_CONFIG_FUSE 00:26:46.371 #undef SPDK_CONFIG_FUZZER 00:26:46.371 #define SPDK_CONFIG_FUZZER_LIB 00:26:46.371 #undef SPDK_CONFIG_GOLANG 00:26:46.371 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:46.371 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:46.371 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:46.371 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:26:46.371 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:46.371 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:46.371 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:46.371 #define SPDK_CONFIG_IDXD 1 00:26:46.371 #undef SPDK_CONFIG_IDXD_KERNEL 00:26:46.371 #define SPDK_CONFIG_IPSEC_MB 1 00:26:46.371 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:46.371 #define SPDK_CONFIG_ISAL 1 00:26:46.371 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:46.371 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:46.371 #define SPDK_CONFIG_LIBDIR 00:26:46.371 #undef SPDK_CONFIG_LTO 00:26:46.371 #define SPDK_CONFIG_MAX_LCORES 00:26:46.371 #define SPDK_CONFIG_NVME_CUSE 1 00:26:46.371 #undef SPDK_CONFIG_OCF 00:26:46.371 #define 
SPDK_CONFIG_OCF_PATH 00:26:46.371 #define SPDK_CONFIG_OPENSSL_PATH 00:26:46.371 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:46.371 #define SPDK_CONFIG_PGO_DIR 00:26:46.371 #undef SPDK_CONFIG_PGO_USE 00:26:46.371 #define SPDK_CONFIG_PREFIX /usr/local 00:26:46.371 #undef SPDK_CONFIG_RAID5F 00:26:46.371 #undef SPDK_CONFIG_RBD 00:26:46.371 #define SPDK_CONFIG_RDMA 1 00:26:46.371 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:46.371 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:46.371 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:46.371 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:46.371 #define SPDK_CONFIG_SHARED 1 00:26:46.371 #undef SPDK_CONFIG_SMA 00:26:46.371 #define SPDK_CONFIG_TESTS 1 00:26:46.371 #undef SPDK_CONFIG_TSAN 00:26:46.371 #define SPDK_CONFIG_UBLK 1 00:26:46.371 #define SPDK_CONFIG_UBSAN 1 00:26:46.371 #undef SPDK_CONFIG_UNIT_TESTS 00:26:46.371 #undef SPDK_CONFIG_URING 00:26:46.371 #define SPDK_CONFIG_URING_PATH 00:26:46.371 #undef SPDK_CONFIG_URING_ZNS 00:26:46.371 #undef SPDK_CONFIG_USDT 00:26:46.371 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:46.371 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:46.371 #undef SPDK_CONFIG_VFIO_USER 00:26:46.371 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:46.371 #define SPDK_CONFIG_VHOST 1 00:26:46.371 #define SPDK_CONFIG_VIRTIO 1 00:26:46.371 #undef SPDK_CONFIG_VTUNE 00:26:46.371 #define SPDK_CONFIG_VTUNE_DIR 00:26:46.371 #define SPDK_CONFIG_WERROR 1 00:26:46.371 #define SPDK_CONFIG_WPDK_DIR 00:26:46.371 #undef SPDK_CONFIG_XNVME 00:26:46.371 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:46.371 00:08:46 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:46.371 00:08:46 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:46.371 00:08:46 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:46.371 00:08:46 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:46.371 00:08:46 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:26:46.371 00:08:46 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:46.371 00:08:46 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@57 -- # : 0 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@61 -- # : 0 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@63 -- # : 0 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@65 -- # : 1 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@67 -- # : 0 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:26:46.371 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@69 -- # : 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@71 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@73 -- # : 1 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@75 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@77 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@79 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@81 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@83 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@85 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@87 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@89 -- # : 0 
00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@91 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@93 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@95 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@97 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@99 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@101 -- # : rdma 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@103 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@105 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@107 -- # : 1 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@109 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@111 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@113 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@115 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@117 -- # : 1 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@119 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@121 -- # : 1 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@123 -- # : 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:46.372 00:08:46 reap_unregistered_poller -- 
common/autotest_common.sh@125 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@127 -- # : 1 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@129 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@131 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@133 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@135 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@137 -- # : 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@139 -- # : true 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@141 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@143 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@145 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@147 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@149 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@151 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@153 -- # : 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@155 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@157 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@159 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:26:46.372 00:08:46 
reap_unregistered_poller -- common/autotest_common.sh@161 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@163 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@168 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@170 -- # : 0 00:26:46.372 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
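The long run of ': 0' / 'export SPDK_TEST_*' pairs above, followed by the SPDK_LIB_DIR, DPDK_LIB_DIR, LD_LIBRARY_PATH and PYTHONPATH exports, is autotest_common.sh defaulting every test flag and wiring up the freshly built libraries; under xtrace each flag shows up as one ':' line plus one 'export' line. The sketch below is an inferred reading of that pattern (the ': "${VAR:=default}"' idiom is an assumption based on the trace, not a quote of the file), using flags this job sets in autorun-spdk.conf and the paths it prints.

    # Keep values already set by autorun-spdk.conf, otherwise default them, then export.
    : "${RUN_NIGHTLY:=0}";              export RUN_NIGHTLY
    : "${SPDK_RUN_FUNCTIONAL_TEST:=0}"; export SPDK_RUN_FUNCTIONAL_TEST
    : "${SPDK_TEST_CRYPTO:=0}";         export SPDK_TEST_CRYPTO
    : "${SPDK_TEST_VBDEV_COMPRESS:=0}"; export SPDK_TEST_VBDEV_COMPRESS
    : "${SPDK_RUN_UBSAN:=0}";           export SPDK_RUN_UBSAN

    # Prefer the just-built SPDK/DPDK/vfio-user shared objects at runtime.
    export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
    export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
    export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SPDK_LIB_DIR:$DPDK_LIB_DIR:$VFIO_LIB_DIR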
00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@199 -- # cat 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@262 -- # export valgrind= 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@262 -- # valgrind= 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@268 -- # uname -s 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]] 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@278 -- # MAKE=make 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@298 -- # TEST_MODE= 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@317 -- # [[ -z 532327 ]] 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@317 -- # kill -0 532327 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@327 -- # [[ -v 
testdir ]] 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@329 -- # local requested_size=2147483648 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local mount target_dir 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:26:46.373 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.u9ylOk 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.u9ylOk/tests/interrupt /tmp/spdk.u9ylOk 00:26:46.633 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@326 -- # df -T 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=969789440 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4314640384 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=88767016960 00:26:46.634 00:08:46 
reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508531712 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=5741514752 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=47249555456 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=18892210176 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901708800 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=9498624 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=47253536768 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=729088 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:26:46.634 * Looking for test storage... 
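The df -T scan above and the comparisons immediately below implement set_test_storage: walk the candidate directories, take the first one whose filesystem offers at least the requested space (about 2 GiB plus margin in this run), and publish it as SPDK_TEST_STORAGE. A condensed sketch of that selection, assuming testdir and storage_fallback are already set as in the trace and that the candidate directories exist (the real helper also creates the fallback directories and can grow tmpfs mounts):

    set_test_storage_sketch() {
        local requested_size=$1 target_dir avail
        for target_dir in "$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback"; do
            avail=$(df --output=avail -B1 "$target_dir" 2>/dev/null | tail -n1)
            [[ -n $avail ]] || continue
            if (( avail >= requested_size )); then
                export SPDK_TEST_STORAGE=$target_dir
                printf '* Found test storage at %s\n' "$target_dir"
                return 0
            fi
        done
        return 1
    }
    # e.g. set_test_storage_sketch 2214592512   # the requested_size printed above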
00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@367 -- # local target_space new_size 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@371 -- # mount=/ 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@373 -- # target_space=88767016960 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@380 -- # new_size=7956107264 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.634 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@388 -- # return 0 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1678 -- # set -o errtrace 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # true 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # xtrace_fd 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:46.634 00:08:46 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:46.634 00:08:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=532459 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 532459 /var/tmp/spdk.sock 00:26:46.634 00:08:47 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@827 -- # '[' -z 532459 ']' 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:26:46.634 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:46.634 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:46.634 [2024-05-15 00:08:47.042772] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:46.634 [2024-05-15 00:08:47.042842] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid532459 ] 00:26:46.634 [2024-05-15 00:08:47.178425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:46.894 [2024-05-15 00:08:47.282292] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:46.894 [2024-05-15 00:08:47.282378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:46.894 [2024-05-15 00:08:47.282385] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:46.894 [2024-05-15 00:08:47.356921] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:47.461 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:47.461 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@860 -- # return 0 00:26:47.461 00:08:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:26:47.461 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:47.461 00:08:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:26:47.461 00:08:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:47.461 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:47.461 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:26:47.461 "name": "app_thread", 00:26:47.461 "id": 1, 00:26:47.461 "active_pollers": [], 00:26:47.461 "timed_pollers": [ 00:26:47.461 { 00:26:47.461 "name": "rpc_subsystem_poll_servers", 00:26:47.461 "id": 1, 00:26:47.461 "state": "waiting", 00:26:47.461 "run_count": 0, 00:26:47.461 "busy_count": 0, 00:26:47.461 "period_ticks": 9200000 00:26:47.461 } 00:26:47.461 ], 00:26:47.461 "paused_pollers": [] 00:26:47.461 }' 00:26:47.461 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:47.719 5000+0 records in 00:26:47.719 5000+0 records out 00:26:47.719 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0262071 s, 391 MB/s 00:26:47.719 00:08:48 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:47.978 AIO0 00:26:47.978 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:48.237 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:26:48.237 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:26:48.237 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:48.237 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:48.237 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:26:48.237 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:48.237 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:26:48.237 "name": "app_thread", 00:26:48.237 "id": 1, 00:26:48.237 "active_pollers": [], 00:26:48.237 "timed_pollers": [ 00:26:48.237 { 00:26:48.237 "name": "rpc_subsystem_poll_servers", 00:26:48.237 "id": 1, 00:26:48.238 "state": "waiting", 00:26:48.238 "run_count": 0, 00:26:48.238 "busy_count": 0, 00:26:48.238 "period_ticks": 9200000 00:26:48.238 } 00:26:48.238 ], 00:26:48.238 "paused_pollers": [] 00:26:48.238 }' 00:26:48.238 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:26:48.497 00:08:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 532459 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@946 -- # '[' -z 532459 ']' 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@950 -- # kill -0 532459 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@951 -- # uname 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 532459 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:48.497 00:08:48 
reap_unregistered_poller -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@964 -- # echo 'killing process with pid 532459' 00:26:48.497 killing process with pid 532459 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@965 -- # kill 532459 00:26:48.497 00:08:48 reap_unregistered_poller -- common/autotest_common.sh@970 -- # wait 532459 00:26:48.756 00:08:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:26:48.756 00:08:49 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:48.756 00:26:48.756 real 0m2.524s 00:26:48.756 user 0m1.592s 00:26:48.756 sys 0m0.690s 00:26:48.756 00:08:49 reap_unregistered_poller -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:48.756 00:08:49 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:48.756 ************************************ 00:26:48.756 END TEST reap_unregistered_poller 00:26:48.756 ************************************ 00:26:48.756 00:08:49 -- spdk/autotest.sh@194 -- # uname -s 00:26:48.756 00:08:49 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:26:48.756 00:08:49 -- spdk/autotest.sh@195 -- # [[ 1 -eq 1 ]] 00:26:48.756 00:08:49 -- spdk/autotest.sh@201 -- # [[ 1 -eq 0 ]] 00:26:48.756 00:08:49 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@256 -- # timing_exit lib 00:26:48.756 00:08:49 -- common/autotest_common.sh@726 -- # xtrace_disable 00:26:48.756 00:08:49 -- common/autotest_common.sh@10 -- # set +x 00:26:48.756 00:08:49 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@275 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@304 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@317 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@326 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@331 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@343 -- # '[' 1 -eq 1 ']' 00:26:48.756 00:08:49 -- spdk/autotest.sh@344 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:48.756 00:08:49 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:48.756 00:08:49 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:48.756 00:08:49 -- common/autotest_common.sh@10 -- # set +x 00:26:49.016 ************************************ 00:26:49.016 START TEST compress_compdev 00:26:49.016 ************************************ 00:26:49.016 00:08:49 compress_compdev -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:49.016 * Looking for test storage... 
00:26:49.016 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:49.016 00:08:49 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:49.016 00:08:49 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:49.016 00:08:49 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:49.016 00:08:49 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:49.016 00:08:49 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:49.016 00:08:49 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:49.016 00:08:49 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:49.016 00:08:49 compress_compdev -- paths/export.sh@5 -- # export PATH 00:26:49.016 00:08:49 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:49.016 00:08:49 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=532811 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 532811 00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 532811 ']' 00:26:49.017 00:08:49 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:49.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
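For readers reconstructing this pass by hand: the bdevperf process launched above sits in -z mode waiting on /var/tmp/spdk.sock, and the trace that follows drives it entirely over RPC. The sketch below is illustrative only; it reuses the workspace path, bdev names and RPC socket shown in this log, assumes /tmp/pmem already exists (created by compress.sh earlier in the trace), and abbreviates the gen_nvme.sh / load_subsystem_config step as a single pipeline.

# Hedged sketch of the create_vols + perform_tests sequence the trace below executes.
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
rpc=$spdk/scripts/rpc.py

# Attach the local NVMe controller to the waiting bdevperf process
# (assumption: gen_nvme.sh output is piped into load_subsystem_config, as the paired @34 trace lines suggest).
$spdk/scripts/gen_nvme.sh | $rpc load_subsystem_config

# Logical-volume store plus a 100 MiB thin-provisioned lvol on Nvme0n1, exactly as traced below.
$rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
$rpc bdev_lvol_create -t -l lvs0 lv0 100

# Claim the lvol with a compress vbdev backed by /tmp/pmem; this first pass gives no -l,
# while the later passes in this log add "-l 512" and "-l 4096".
$rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
$rpc bdev_wait_for_examine

# Run the 3-second verify workload against COMP_lvs0/lv0.
$spdk/examples/bdev/bdevperf/bdevperf.py perform_tests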
00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:49.017 00:08:49 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:49.017 [2024-05-15 00:08:49.592537] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:26:49.017 [2024-05-15 00:08:49.592600] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid532811 ] 00:26:49.276 [2024-05-15 00:08:49.712753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:49.276 [2024-05-15 00:08:49.820424] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:49.276 [2024-05-15 00:08:49.820429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:50.211 [2024-05-15 00:08:50.573495] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:50.211 00:08:50 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:50.211 00:08:50 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:26:50.211 00:08:50 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:26:50.211 00:08:50 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:50.211 00:08:50 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:50.777 [2024-05-15 00:08:51.208745] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x269a5f0 PMD being used: compress_qat 00:26:50.777 00:08:51 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:50.777 00:08:51 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:51.036 00:08:51 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:51.294 [ 00:26:51.294 { 00:26:51.294 "name": "Nvme0n1", 00:26:51.294 "aliases": [ 00:26:51.294 "01000000-0000-0000-5cd2-e43197705251" 00:26:51.294 ], 00:26:51.294 "product_name": "NVMe disk", 00:26:51.294 "block_size": 512, 00:26:51.294 "num_blocks": 15002931888, 00:26:51.294 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:51.294 "assigned_rate_limits": { 00:26:51.294 "rw_ios_per_sec": 0, 00:26:51.294 "rw_mbytes_per_sec": 0, 00:26:51.294 "r_mbytes_per_sec": 0, 00:26:51.294 "w_mbytes_per_sec": 0 00:26:51.294 }, 00:26:51.294 "claimed": false, 00:26:51.294 "zoned": false, 00:26:51.294 "supported_io_types": { 00:26:51.294 "read": true, 00:26:51.294 "write": true, 00:26:51.294 "unmap": true, 00:26:51.294 "write_zeroes": true, 00:26:51.295 "flush": true, 00:26:51.295 "reset": true, 00:26:51.295 "compare": false, 00:26:51.295 "compare_and_write": false, 00:26:51.295 "abort": true, 00:26:51.295 "nvme_admin": true, 00:26:51.295 
"nvme_io": true 00:26:51.295 }, 00:26:51.295 "driver_specific": { 00:26:51.295 "nvme": [ 00:26:51.295 { 00:26:51.295 "pci_address": "0000:5e:00.0", 00:26:51.295 "trid": { 00:26:51.295 "trtype": "PCIe", 00:26:51.295 "traddr": "0000:5e:00.0" 00:26:51.295 }, 00:26:51.295 "ctrlr_data": { 00:26:51.295 "cntlid": 0, 00:26:51.295 "vendor_id": "0x8086", 00:26:51.295 "model_number": "INTEL SSDPF2KX076TZO", 00:26:51.295 "serial_number": "PHAC0301002G7P6CGN", 00:26:51.295 "firmware_revision": "JCV10200", 00:26:51.295 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:51.295 "oacs": { 00:26:51.295 "security": 1, 00:26:51.295 "format": 1, 00:26:51.295 "firmware": 1, 00:26:51.295 "ns_manage": 1 00:26:51.295 }, 00:26:51.295 "multi_ctrlr": false, 00:26:51.295 "ana_reporting": false 00:26:51.295 }, 00:26:51.295 "vs": { 00:26:51.295 "nvme_version": "1.3" 00:26:51.295 }, 00:26:51.295 "ns_data": { 00:26:51.295 "id": 1, 00:26:51.295 "can_share": false 00:26:51.295 }, 00:26:51.295 "security": { 00:26:51.295 "opal": true 00:26:51.295 } 00:26:51.295 } 00:26:51.295 ], 00:26:51.295 "mp_policy": "active_passive" 00:26:51.295 } 00:26:51.295 } 00:26:51.295 ] 00:26:51.295 00:08:51 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:51.295 00:08:51 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:51.553 [2024-05-15 00:08:51.962457] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x269af30 PMD being used: compress_qat 00:26:54.116 5cb201ec-a70f-4e22-8e25-51f08c7a8cc8 00:26:54.116 00:08:54 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:54.116 8fbda541-0412-480f-bdb1-e1fdf7ba1bb2 00:26:54.116 00:08:54 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:54.116 00:08:54 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:54.117 00:08:54 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:54.386 [ 00:26:54.386 { 00:26:54.386 "name": "8fbda541-0412-480f-bdb1-e1fdf7ba1bb2", 00:26:54.386 "aliases": [ 00:26:54.386 "lvs0/lv0" 00:26:54.386 ], 00:26:54.386 "product_name": "Logical Volume", 00:26:54.386 "block_size": 512, 00:26:54.386 "num_blocks": 204800, 00:26:54.386 "uuid": "8fbda541-0412-480f-bdb1-e1fdf7ba1bb2", 00:26:54.386 "assigned_rate_limits": { 00:26:54.386 "rw_ios_per_sec": 0, 00:26:54.386 "rw_mbytes_per_sec": 0, 00:26:54.386 "r_mbytes_per_sec": 0, 00:26:54.386 "w_mbytes_per_sec": 0 00:26:54.386 }, 00:26:54.386 "claimed": false, 00:26:54.386 "zoned": false, 00:26:54.386 "supported_io_types": { 00:26:54.386 "read": true, 00:26:54.386 "write": true, 00:26:54.386 "unmap": true, 00:26:54.386 "write_zeroes": true, 00:26:54.386 "flush": false, 00:26:54.386 "reset": true, 00:26:54.386 
"compare": false, 00:26:54.386 "compare_and_write": false, 00:26:54.386 "abort": false, 00:26:54.386 "nvme_admin": false, 00:26:54.386 "nvme_io": false 00:26:54.386 }, 00:26:54.386 "driver_specific": { 00:26:54.386 "lvol": { 00:26:54.386 "lvol_store_uuid": "5cb201ec-a70f-4e22-8e25-51f08c7a8cc8", 00:26:54.386 "base_bdev": "Nvme0n1", 00:26:54.386 "thin_provision": true, 00:26:54.386 "num_allocated_clusters": 0, 00:26:54.386 "snapshot": false, 00:26:54.386 "clone": false, 00:26:54.386 "esnap_clone": false 00:26:54.386 } 00:26:54.386 } 00:26:54.386 } 00:26:54.386 ] 00:26:54.386 00:08:54 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:54.386 00:08:54 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:54.386 00:08:54 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:54.645 [2024-05-15 00:08:55.157284] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:54.645 COMP_lvs0/lv0 00:26:54.645 00:08:55 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:54.645 00:08:55 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:54.904 00:08:55 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:55.162 [ 00:26:55.162 { 00:26:55.162 "name": "COMP_lvs0/lv0", 00:26:55.162 "aliases": [ 00:26:55.162 "4d63f1ed-fa4c-5dd2-8054-2c36aa905ac9" 00:26:55.162 ], 00:26:55.162 "product_name": "compress", 00:26:55.162 "block_size": 512, 00:26:55.162 "num_blocks": 200704, 00:26:55.162 "uuid": "4d63f1ed-fa4c-5dd2-8054-2c36aa905ac9", 00:26:55.162 "assigned_rate_limits": { 00:26:55.162 "rw_ios_per_sec": 0, 00:26:55.162 "rw_mbytes_per_sec": 0, 00:26:55.162 "r_mbytes_per_sec": 0, 00:26:55.162 "w_mbytes_per_sec": 0 00:26:55.162 }, 00:26:55.162 "claimed": false, 00:26:55.162 "zoned": false, 00:26:55.162 "supported_io_types": { 00:26:55.162 "read": true, 00:26:55.162 "write": true, 00:26:55.162 "unmap": false, 00:26:55.162 "write_zeroes": true, 00:26:55.162 "flush": false, 00:26:55.162 "reset": false, 00:26:55.162 "compare": false, 00:26:55.162 "compare_and_write": false, 00:26:55.162 "abort": false, 00:26:55.162 "nvme_admin": false, 00:26:55.162 "nvme_io": false 00:26:55.162 }, 00:26:55.162 "driver_specific": { 00:26:55.162 "compress": { 00:26:55.162 "name": "COMP_lvs0/lv0", 00:26:55.162 "base_bdev_name": "8fbda541-0412-480f-bdb1-e1fdf7ba1bb2" 00:26:55.162 } 00:26:55.162 } 00:26:55.162 } 00:26:55.162 ] 00:26:55.162 00:08:55 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:55.162 00:08:55 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:55.162 [2024-05-15 00:08:55.703281] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x7fba641b15a0 PMD being used: compress_qat 00:26:55.162 [2024-05-15 00:08:55.705570] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25748d0 PMD being used: compress_qat 00:26:55.162 Running I/O for 3 seconds... 00:26:58.450 00:26:58.450 Latency(us) 00:26:58.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:58.450 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:58.450 Verification LBA range: start 0x0 length 0x3100 00:26:58.450 COMP_lvs0/lv0 : 3.00 5051.91 19.73 0.00 0.00 6281.53 498.64 5869.75 00:26:58.450 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:58.450 Verification LBA range: start 0x3100 length 0x3100 00:26:58.450 COMP_lvs0/lv0 : 3.00 5322.11 20.79 0.00 0.00 5975.87 341.93 5670.29 00:26:58.450 =================================================================================================================== 00:26:58.450 Total : 10374.02 40.52 0.00 0.00 6124.74 341.93 5869.75 00:26:58.450 0 00:26:58.450 00:08:58 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:58.450 00:08:58 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:58.450 00:08:58 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:58.709 00:08:59 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:58.709 00:08:59 compress_compdev -- compress/compress.sh@78 -- # killprocess 532811 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 532811 ']' 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 532811 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 532811 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 532811' 00:26:58.709 killing process with pid 532811 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@965 -- # kill 532811 00:26:58.709 Received shutdown signal, test time was about 3.000000 seconds 00:26:58.709 00:26:58.709 Latency(us) 00:26:58.709 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:58.709 =================================================================================================================== 00:26:58.709 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:58.709 00:08:59 compress_compdev -- common/autotest_common.sh@970 -- # wait 532811 00:27:02.007 00:09:02 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:02.007 00:09:02 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:02.007 00:09:02 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=534576 00:27:02.007 00:09:02 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:02.007 00:09:02 compress_compdev -- 
compress/compress.sh@73 -- # waitforlisten 534576 00:27:02.007 00:09:02 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 534576 ']' 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:02.007 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:02.007 00:09:02 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:02.007 [2024-05-15 00:09:02.141022] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:27:02.007 [2024-05-15 00:09:02.141086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid534576 ] 00:27:02.007 [2024-05-15 00:09:02.262218] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:02.007 [2024-05-15 00:09:02.368904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:02.007 [2024-05-15 00:09:02.368910] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:02.574 [2024-05-15 00:09:03.136955] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:02.832 00:09:03 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:02.832 00:09:03 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:27:02.832 00:09:03 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:27:02.832 00:09:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:02.832 00:09:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:03.400 [2024-05-15 00:09:03.712464] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18925f0 PMD being used: compress_qat 00:27:03.400 00:09:03 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:03.400 00:09:03 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:03.658 [ 00:27:03.658 { 00:27:03.658 "name": "Nvme0n1", 
00:27:03.658 "aliases": [ 00:27:03.658 "01000000-0000-0000-5cd2-e43197705251" 00:27:03.658 ], 00:27:03.658 "product_name": "NVMe disk", 00:27:03.658 "block_size": 512, 00:27:03.658 "num_blocks": 15002931888, 00:27:03.658 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:03.658 "assigned_rate_limits": { 00:27:03.658 "rw_ios_per_sec": 0, 00:27:03.658 "rw_mbytes_per_sec": 0, 00:27:03.658 "r_mbytes_per_sec": 0, 00:27:03.658 "w_mbytes_per_sec": 0 00:27:03.658 }, 00:27:03.658 "claimed": false, 00:27:03.658 "zoned": false, 00:27:03.658 "supported_io_types": { 00:27:03.658 "read": true, 00:27:03.658 "write": true, 00:27:03.658 "unmap": true, 00:27:03.658 "write_zeroes": true, 00:27:03.658 "flush": true, 00:27:03.658 "reset": true, 00:27:03.658 "compare": false, 00:27:03.658 "compare_and_write": false, 00:27:03.658 "abort": true, 00:27:03.658 "nvme_admin": true, 00:27:03.658 "nvme_io": true 00:27:03.658 }, 00:27:03.658 "driver_specific": { 00:27:03.658 "nvme": [ 00:27:03.658 { 00:27:03.658 "pci_address": "0000:5e:00.0", 00:27:03.658 "trid": { 00:27:03.658 "trtype": "PCIe", 00:27:03.658 "traddr": "0000:5e:00.0" 00:27:03.658 }, 00:27:03.658 "ctrlr_data": { 00:27:03.658 "cntlid": 0, 00:27:03.658 "vendor_id": "0x8086", 00:27:03.658 "model_number": "INTEL SSDPF2KX076TZO", 00:27:03.658 "serial_number": "PHAC0301002G7P6CGN", 00:27:03.658 "firmware_revision": "JCV10200", 00:27:03.658 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:03.658 "oacs": { 00:27:03.658 "security": 1, 00:27:03.658 "format": 1, 00:27:03.658 "firmware": 1, 00:27:03.658 "ns_manage": 1 00:27:03.658 }, 00:27:03.658 "multi_ctrlr": false, 00:27:03.658 "ana_reporting": false 00:27:03.658 }, 00:27:03.658 "vs": { 00:27:03.658 "nvme_version": "1.3" 00:27:03.658 }, 00:27:03.658 "ns_data": { 00:27:03.658 "id": 1, 00:27:03.658 "can_share": false 00:27:03.658 }, 00:27:03.658 "security": { 00:27:03.658 "opal": true 00:27:03.658 } 00:27:03.658 } 00:27:03.658 ], 00:27:03.658 "mp_policy": "active_passive" 00:27:03.658 } 00:27:03.658 } 00:27:03.658 ] 00:27:03.658 00:09:04 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:03.658 00:09:04 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:03.917 [2024-05-15 00:09:04.409962] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16ca450 PMD being used: compress_qat 00:27:06.454 839444f6-08d0-482a-ba7c-0a0fe483b8f5 00:27:06.454 00:09:06 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:06.454 b00bce65-a460-4877-ab7d-afeec8f4e35b 00:27:06.454 00:09:06 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:06.454 00:09:06 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:06.712 00:09:07 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:06.971 [ 00:27:06.971 { 00:27:06.971 "name": "b00bce65-a460-4877-ab7d-afeec8f4e35b", 00:27:06.971 "aliases": [ 00:27:06.971 "lvs0/lv0" 00:27:06.971 ], 00:27:06.971 "product_name": "Logical Volume", 00:27:06.971 "block_size": 512, 00:27:06.971 "num_blocks": 204800, 00:27:06.971 "uuid": "b00bce65-a460-4877-ab7d-afeec8f4e35b", 00:27:06.971 "assigned_rate_limits": { 00:27:06.971 "rw_ios_per_sec": 0, 00:27:06.971 "rw_mbytes_per_sec": 0, 00:27:06.971 "r_mbytes_per_sec": 0, 00:27:06.971 "w_mbytes_per_sec": 0 00:27:06.971 }, 00:27:06.971 "claimed": false, 00:27:06.971 "zoned": false, 00:27:06.971 "supported_io_types": { 00:27:06.971 "read": true, 00:27:06.971 "write": true, 00:27:06.971 "unmap": true, 00:27:06.971 "write_zeroes": true, 00:27:06.971 "flush": false, 00:27:06.971 "reset": true, 00:27:06.971 "compare": false, 00:27:06.971 "compare_and_write": false, 00:27:06.971 "abort": false, 00:27:06.971 "nvme_admin": false, 00:27:06.971 "nvme_io": false 00:27:06.971 }, 00:27:06.971 "driver_specific": { 00:27:06.971 "lvol": { 00:27:06.971 "lvol_store_uuid": "839444f6-08d0-482a-ba7c-0a0fe483b8f5", 00:27:06.971 "base_bdev": "Nvme0n1", 00:27:06.971 "thin_provision": true, 00:27:06.971 "num_allocated_clusters": 0, 00:27:06.971 "snapshot": false, 00:27:06.971 "clone": false, 00:27:06.971 "esnap_clone": false 00:27:06.971 } 00:27:06.971 } 00:27:06.971 } 00:27:06.971 ] 00:27:06.971 00:09:07 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:06.971 00:09:07 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:06.971 00:09:07 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:07.230 [2024-05-15 00:09:07.600618] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:07.230 COMP_lvs0/lv0 00:27:07.230 00:09:07 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:07.230 00:09:07 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:07.489 00:09:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:07.747 [ 00:27:07.747 { 00:27:07.747 "name": "COMP_lvs0/lv0", 00:27:07.747 "aliases": [ 00:27:07.748 "a0bede89-6627-582c-a16f-e8e3ba7938f4" 00:27:07.748 ], 00:27:07.748 "product_name": "compress", 00:27:07.748 "block_size": 512, 00:27:07.748 "num_blocks": 200704, 00:27:07.748 "uuid": "a0bede89-6627-582c-a16f-e8e3ba7938f4", 00:27:07.748 "assigned_rate_limits": { 00:27:07.748 "rw_ios_per_sec": 0, 00:27:07.748 "rw_mbytes_per_sec": 0, 00:27:07.748 "r_mbytes_per_sec": 0, 00:27:07.748 "w_mbytes_per_sec": 0 00:27:07.748 }, 00:27:07.748 "claimed": false, 00:27:07.748 "zoned": false, 00:27:07.748 "supported_io_types": 
{ 00:27:07.748 "read": true, 00:27:07.748 "write": true, 00:27:07.748 "unmap": false, 00:27:07.748 "write_zeroes": true, 00:27:07.748 "flush": false, 00:27:07.748 "reset": false, 00:27:07.748 "compare": false, 00:27:07.748 "compare_and_write": false, 00:27:07.748 "abort": false, 00:27:07.748 "nvme_admin": false, 00:27:07.748 "nvme_io": false 00:27:07.748 }, 00:27:07.748 "driver_specific": { 00:27:07.748 "compress": { 00:27:07.748 "name": "COMP_lvs0/lv0", 00:27:07.748 "base_bdev_name": "b00bce65-a460-4877-ab7d-afeec8f4e35b" 00:27:07.748 } 00:27:07.748 } 00:27:07.748 } 00:27:07.748 ] 00:27:07.748 00:09:08 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:07.748 00:09:08 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:07.748 [2024-05-15 00:09:08.194731] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc4d01b15a0 PMD being used: compress_qat 00:27:07.748 [2024-05-15 00:09:08.196960] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1773c20 PMD being used: compress_qat 00:27:07.748 Running I/O for 3 seconds... 00:27:11.036 00:27:11.036 Latency(us) 00:27:11.036 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:11.036 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:11.036 Verification LBA range: start 0x0 length 0x3100 00:27:11.036 COMP_lvs0/lv0 : 3.00 5048.65 19.72 0.00 0.00 6286.32 584.13 5641.79 00:27:11.036 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:11.036 Verification LBA range: start 0x3100 length 0x3100 00:27:11.036 COMP_lvs0/lv0 : 3.00 5332.87 20.83 0.00 0.00 5964.10 372.20 5584.81 00:27:11.036 =================================================================================================================== 00:27:11.036 Total : 10381.51 40.55 0.00 0.00 6120.79 372.20 5641.79 00:27:11.036 0 00:27:11.036 00:09:11 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:11.036 00:09:11 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:11.036 00:09:11 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:11.295 00:09:11 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:11.295 00:09:11 compress_compdev -- compress/compress.sh@78 -- # killprocess 534576 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 534576 ']' 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 534576 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 534576 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 534576' 00:27:11.295 killing process with pid 534576 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@965 -- # kill 534576 00:27:11.295 Received shutdown signal, 
test time was about 3.000000 seconds 00:27:11.295 00:27:11.295 Latency(us) 00:27:11.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:11.295 =================================================================================================================== 00:27:11.295 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:11.295 00:09:11 compress_compdev -- common/autotest_common.sh@970 -- # wait 534576 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=536561 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:14.580 00:09:14 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 536561 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 536561 ']' 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:14.580 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:14.580 00:09:14 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:14.580 [2024-05-15 00:09:14.578432] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:27:14.580 [2024-05-15 00:09:14.578506] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid536561 ] 00:27:14.580 [2024-05-15 00:09:14.700219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:14.580 [2024-05-15 00:09:14.798370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:14.580 [2024-05-15 00:09:14.798375] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:15.147 [2024-05-15 00:09:15.538494] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:15.147 00:09:15 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:15.147 00:09:15 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:27:15.147 00:09:15 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:27:15.147 00:09:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:15.147 00:09:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:15.715 [2024-05-15 00:09:16.184592] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x178e5f0 PMD being used: compress_qat 00:27:15.715 00:09:16 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:15.715 00:09:16 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:15.973 00:09:16 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:16.231 [ 00:27:16.231 { 00:27:16.231 "name": "Nvme0n1", 00:27:16.231 "aliases": [ 00:27:16.231 "01000000-0000-0000-5cd2-e43197705251" 00:27:16.231 ], 00:27:16.231 "product_name": "NVMe disk", 00:27:16.231 "block_size": 512, 00:27:16.231 "num_blocks": 15002931888, 00:27:16.231 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:16.231 "assigned_rate_limits": { 00:27:16.231 "rw_ios_per_sec": 0, 00:27:16.231 "rw_mbytes_per_sec": 0, 00:27:16.231 "r_mbytes_per_sec": 0, 00:27:16.231 "w_mbytes_per_sec": 0 00:27:16.231 }, 00:27:16.231 "claimed": false, 00:27:16.231 "zoned": false, 00:27:16.231 "supported_io_types": { 00:27:16.231 "read": true, 00:27:16.231 "write": true, 00:27:16.231 "unmap": true, 00:27:16.231 "write_zeroes": true, 00:27:16.231 "flush": true, 00:27:16.231 "reset": true, 00:27:16.231 "compare": false, 00:27:16.231 "compare_and_write": false, 00:27:16.231 "abort": true, 00:27:16.231 "nvme_admin": true, 00:27:16.231 "nvme_io": true 00:27:16.231 }, 00:27:16.231 "driver_specific": { 00:27:16.231 "nvme": [ 00:27:16.231 { 00:27:16.231 "pci_address": "0000:5e:00.0", 00:27:16.231 "trid": { 00:27:16.231 "trtype": "PCIe", 00:27:16.231 "traddr": "0000:5e:00.0" 00:27:16.231 }, 00:27:16.231 "ctrlr_data": { 
00:27:16.231 "cntlid": 0, 00:27:16.231 "vendor_id": "0x8086", 00:27:16.231 "model_number": "INTEL SSDPF2KX076TZO", 00:27:16.231 "serial_number": "PHAC0301002G7P6CGN", 00:27:16.231 "firmware_revision": "JCV10200", 00:27:16.231 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:16.231 "oacs": { 00:27:16.231 "security": 1, 00:27:16.231 "format": 1, 00:27:16.231 "firmware": 1, 00:27:16.231 "ns_manage": 1 00:27:16.231 }, 00:27:16.232 "multi_ctrlr": false, 00:27:16.232 "ana_reporting": false 00:27:16.232 }, 00:27:16.232 "vs": { 00:27:16.232 "nvme_version": "1.3" 00:27:16.232 }, 00:27:16.232 "ns_data": { 00:27:16.232 "id": 1, 00:27:16.232 "can_share": false 00:27:16.232 }, 00:27:16.232 "security": { 00:27:16.232 "opal": true 00:27:16.232 } 00:27:16.232 } 00:27:16.232 ], 00:27:16.232 "mp_policy": "active_passive" 00:27:16.232 } 00:27:16.232 } 00:27:16.232 ] 00:27:16.232 00:09:16 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:16.232 00:09:16 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:16.490 [2024-05-15 00:09:16.942125] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x178ef30 PMD being used: compress_qat 00:27:19.053 37a9db5a-dec5-46b0-9847-7f652dc14887 00:27:19.053 00:09:19 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:19.053 0ccda65d-510a-4871-b7a2-8a68328ca5d5 00:27:19.053 00:09:19 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:19.053 00:09:19 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:19.311 00:09:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:19.311 [ 00:27:19.311 { 00:27:19.311 "name": "0ccda65d-510a-4871-b7a2-8a68328ca5d5", 00:27:19.311 "aliases": [ 00:27:19.311 "lvs0/lv0" 00:27:19.311 ], 00:27:19.311 "product_name": "Logical Volume", 00:27:19.311 "block_size": 512, 00:27:19.311 "num_blocks": 204800, 00:27:19.311 "uuid": "0ccda65d-510a-4871-b7a2-8a68328ca5d5", 00:27:19.311 "assigned_rate_limits": { 00:27:19.311 "rw_ios_per_sec": 0, 00:27:19.311 "rw_mbytes_per_sec": 0, 00:27:19.311 "r_mbytes_per_sec": 0, 00:27:19.311 "w_mbytes_per_sec": 0 00:27:19.311 }, 00:27:19.311 "claimed": false, 00:27:19.311 "zoned": false, 00:27:19.311 "supported_io_types": { 00:27:19.311 "read": true, 00:27:19.311 "write": true, 00:27:19.311 "unmap": true, 00:27:19.311 "write_zeroes": true, 00:27:19.311 "flush": false, 00:27:19.311 "reset": true, 00:27:19.311 "compare": false, 00:27:19.311 "compare_and_write": false, 00:27:19.311 "abort": false, 00:27:19.311 "nvme_admin": false, 00:27:19.311 "nvme_io": false 00:27:19.311 }, 00:27:19.311 "driver_specific": { 00:27:19.311 "lvol": { 00:27:19.311 "lvol_store_uuid": 
"37a9db5a-dec5-46b0-9847-7f652dc14887", 00:27:19.311 "base_bdev": "Nvme0n1", 00:27:19.311 "thin_provision": true, 00:27:19.311 "num_allocated_clusters": 0, 00:27:19.311 "snapshot": false, 00:27:19.311 "clone": false, 00:27:19.311 "esnap_clone": false 00:27:19.311 } 00:27:19.311 } 00:27:19.311 } 00:27:19.311 ] 00:27:19.568 00:09:19 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:19.568 00:09:19 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:19.568 00:09:19 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:19.568 [2024-05-15 00:09:20.141051] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:19.568 COMP_lvs0/lv0 00:27:19.826 00:09:20 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:19.826 00:09:20 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:20.084 00:09:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:20.084 [ 00:27:20.084 { 00:27:20.084 "name": "COMP_lvs0/lv0", 00:27:20.084 "aliases": [ 00:27:20.084 "a5fd11e2-c826-5582-91da-ac6a159c86e8" 00:27:20.084 ], 00:27:20.084 "product_name": "compress", 00:27:20.084 "block_size": 4096, 00:27:20.084 "num_blocks": 25088, 00:27:20.084 "uuid": "a5fd11e2-c826-5582-91da-ac6a159c86e8", 00:27:20.084 "assigned_rate_limits": { 00:27:20.084 "rw_ios_per_sec": 0, 00:27:20.084 "rw_mbytes_per_sec": 0, 00:27:20.084 "r_mbytes_per_sec": 0, 00:27:20.084 "w_mbytes_per_sec": 0 00:27:20.084 }, 00:27:20.084 "claimed": false, 00:27:20.084 "zoned": false, 00:27:20.084 "supported_io_types": { 00:27:20.084 "read": true, 00:27:20.084 "write": true, 00:27:20.084 "unmap": false, 00:27:20.084 "write_zeroes": true, 00:27:20.084 "flush": false, 00:27:20.084 "reset": false, 00:27:20.084 "compare": false, 00:27:20.084 "compare_and_write": false, 00:27:20.084 "abort": false, 00:27:20.084 "nvme_admin": false, 00:27:20.084 "nvme_io": false 00:27:20.084 }, 00:27:20.084 "driver_specific": { 00:27:20.084 "compress": { 00:27:20.084 "name": "COMP_lvs0/lv0", 00:27:20.084 "base_bdev_name": "0ccda65d-510a-4871-b7a2-8a68328ca5d5" 00:27:20.084 } 00:27:20.084 } 00:27:20.084 } 00:27:20.084 ] 00:27:20.084 00:09:20 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:20.342 00:09:20 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:20.342 [2024-05-15 00:09:20.775319] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f38181b15a0 PMD being used: compress_qat 00:27:20.342 [2024-05-15 00:09:20.777563] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16689f0 PMD being used: compress_qat 00:27:20.342 Running I/O for 3 seconds... 
00:27:23.628 00:27:23.628 Latency(us) 00:27:23.628 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:23.628 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:23.628 Verification LBA range: start 0x0 length 0x3100 00:27:23.628 COMP_lvs0/lv0 : 3.00 5045.56 19.71 0.00 0.00 6291.00 315.21 5784.26 00:27:23.628 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:23.628 Verification LBA range: start 0x3100 length 0x3100 00:27:23.628 COMP_lvs0/lv0 : 3.00 5297.17 20.69 0.00 0.00 6003.82 302.75 5727.28 00:27:23.628 =================================================================================================================== 00:27:23.628 Total : 10342.72 40.40 0.00 0.00 6143.92 302.75 5784.26 00:27:23.628 0 00:27:23.628 00:09:23 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:23.628 00:09:23 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:23.628 00:09:24 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:23.887 00:09:24 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:23.887 00:09:24 compress_compdev -- compress/compress.sh@78 -- # killprocess 536561 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 536561 ']' 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 536561 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 536561 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 536561' 00:27:23.887 killing process with pid 536561 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@965 -- # kill 536561 00:27:23.887 Received shutdown signal, test time was about 3.000000 seconds 00:27:23.887 00:27:23.887 Latency(us) 00:27:23.887 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:23.887 =================================================================================================================== 00:27:23.887 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:23.887 00:09:24 compress_compdev -- common/autotest_common.sh@970 -- # wait 536561 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=538232 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:27:27.172 00:09:27 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 538232 
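Every pass tears down in the same order before the next one starts: destroy_vols removes the compress vbdev first and the lvstore second, then killprocess stops the bdevperf/bdevio app and waits for it so the follow-on run sees a clean target. A condensed sketch of that teardown, with the helper bodies reduced to their visible effect (the full helpers in autotest_common.sh do more checking than shown here):

  # destroy_vols (compress.sh@29-30)
  scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
  scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
  # killprocess $pid (autotest_common.sh): confirm the pid still belongs to an
  # SPDK reactor, then signal it and wait for a clean exit
  kill "$pid"
  wait "$pid"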
00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 538232 ']' 00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:27.172 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:27.172 00:09:27 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:27.172 [2024-05-15 00:09:27.192754] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:27:27.172 [2024-05-15 00:09:27.192824] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid538232 ] 00:27:27.172 [2024-05-15 00:09:27.321980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:27.172 [2024-05-15 00:09:27.421036] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:27.172 [2024-05-15 00:09:27.421120] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:27.172 [2024-05-15 00:09:27.421124] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:27.739 [2024-05-15 00:09:28.181843] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:27.739 00:09:28 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:27.739 00:09:28 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:27:27.739 00:09:28 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:27:27.739 00:09:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:27.739 00:09:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:28.307 [2024-05-15 00:09:28.824851] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b46f80 PMD being used: compress_qat 00:27:28.307 00:09:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:28.307 00:09:28 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:28.565 00:09:29 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:28.823 [ 00:27:28.823 { 00:27:28.823 "name": "Nvme0n1", 00:27:28.823 "aliases": [ 00:27:28.823 "01000000-0000-0000-5cd2-e43197705251" 00:27:28.823 ], 00:27:28.823 "product_name": "NVMe disk", 00:27:28.823 "block_size": 512, 00:27:28.823 "num_blocks": 15002931888, 
00:27:28.823 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:28.823 "assigned_rate_limits": { 00:27:28.823 "rw_ios_per_sec": 0, 00:27:28.823 "rw_mbytes_per_sec": 0, 00:27:28.823 "r_mbytes_per_sec": 0, 00:27:28.823 "w_mbytes_per_sec": 0 00:27:28.823 }, 00:27:28.823 "claimed": false, 00:27:28.823 "zoned": false, 00:27:28.823 "supported_io_types": { 00:27:28.823 "read": true, 00:27:28.823 "write": true, 00:27:28.823 "unmap": true, 00:27:28.823 "write_zeroes": true, 00:27:28.823 "flush": true, 00:27:28.823 "reset": true, 00:27:28.823 "compare": false, 00:27:28.823 "compare_and_write": false, 00:27:28.823 "abort": true, 00:27:28.823 "nvme_admin": true, 00:27:28.823 "nvme_io": true 00:27:28.823 }, 00:27:28.823 "driver_specific": { 00:27:28.823 "nvme": [ 00:27:28.823 { 00:27:28.823 "pci_address": "0000:5e:00.0", 00:27:28.823 "trid": { 00:27:28.823 "trtype": "PCIe", 00:27:28.823 "traddr": "0000:5e:00.0" 00:27:28.823 }, 00:27:28.823 "ctrlr_data": { 00:27:28.823 "cntlid": 0, 00:27:28.823 "vendor_id": "0x8086", 00:27:28.823 "model_number": "INTEL SSDPF2KX076TZO", 00:27:28.823 "serial_number": "PHAC0301002G7P6CGN", 00:27:28.823 "firmware_revision": "JCV10200", 00:27:28.823 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:28.823 "oacs": { 00:27:28.824 "security": 1, 00:27:28.824 "format": 1, 00:27:28.824 "firmware": 1, 00:27:28.824 "ns_manage": 1 00:27:28.824 }, 00:27:28.824 "multi_ctrlr": false, 00:27:28.824 "ana_reporting": false 00:27:28.824 }, 00:27:28.824 "vs": { 00:27:28.824 "nvme_version": "1.3" 00:27:28.824 }, 00:27:28.824 "ns_data": { 00:27:28.824 "id": 1, 00:27:28.824 "can_share": false 00:27:28.824 }, 00:27:28.824 "security": { 00:27:28.824 "opal": true 00:27:28.824 } 00:27:28.824 } 00:27:28.824 ], 00:27:28.824 "mp_policy": "active_passive" 00:27:28.824 } 00:27:28.824 } 00:27:28.824 ] 00:27:28.824 00:09:29 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:28.824 00:09:29 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:29.081 [2024-05-15 00:09:29.590544] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19abca0 PMD being used: compress_qat 00:27:31.610 3ec29b4f-9ee7-48c3-a01d-34354eae8f4e 00:27:31.610 00:09:31 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:31.611 94be7082-179f-4159-9995-e4094dd40ca6 00:27:31.611 00:09:32 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:31.611 00:09:32 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:31.869 00:09:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:32.127 [ 00:27:32.127 { 00:27:32.127 "name": "94be7082-179f-4159-9995-e4094dd40ca6", 00:27:32.127 "aliases": [ 00:27:32.127 
"lvs0/lv0" 00:27:32.127 ], 00:27:32.127 "product_name": "Logical Volume", 00:27:32.127 "block_size": 512, 00:27:32.127 "num_blocks": 204800, 00:27:32.127 "uuid": "94be7082-179f-4159-9995-e4094dd40ca6", 00:27:32.127 "assigned_rate_limits": { 00:27:32.127 "rw_ios_per_sec": 0, 00:27:32.127 "rw_mbytes_per_sec": 0, 00:27:32.127 "r_mbytes_per_sec": 0, 00:27:32.127 "w_mbytes_per_sec": 0 00:27:32.127 }, 00:27:32.127 "claimed": false, 00:27:32.127 "zoned": false, 00:27:32.127 "supported_io_types": { 00:27:32.127 "read": true, 00:27:32.127 "write": true, 00:27:32.127 "unmap": true, 00:27:32.127 "write_zeroes": true, 00:27:32.127 "flush": false, 00:27:32.127 "reset": true, 00:27:32.127 "compare": false, 00:27:32.127 "compare_and_write": false, 00:27:32.127 "abort": false, 00:27:32.127 "nvme_admin": false, 00:27:32.127 "nvme_io": false 00:27:32.127 }, 00:27:32.127 "driver_specific": { 00:27:32.127 "lvol": { 00:27:32.127 "lvol_store_uuid": "3ec29b4f-9ee7-48c3-a01d-34354eae8f4e", 00:27:32.127 "base_bdev": "Nvme0n1", 00:27:32.127 "thin_provision": true, 00:27:32.127 "num_allocated_clusters": 0, 00:27:32.127 "snapshot": false, 00:27:32.127 "clone": false, 00:27:32.127 "esnap_clone": false 00:27:32.127 } 00:27:32.127 } 00:27:32.127 } 00:27:32.127 ] 00:27:32.127 00:09:32 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:32.127 00:09:32 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:32.127 00:09:32 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:32.385 [2024-05-15 00:09:32.810152] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:32.385 COMP_lvs0/lv0 00:27:32.385 00:09:32 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:32.385 00:09:32 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:32.643 00:09:33 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:32.902 [ 00:27:32.902 { 00:27:32.902 "name": "COMP_lvs0/lv0", 00:27:32.902 "aliases": [ 00:27:32.902 "a2202d50-2e7d-5c1a-b3bb-9e9276dc1443" 00:27:32.902 ], 00:27:32.902 "product_name": "compress", 00:27:32.902 "block_size": 512, 00:27:32.902 "num_blocks": 200704, 00:27:32.902 "uuid": "a2202d50-2e7d-5c1a-b3bb-9e9276dc1443", 00:27:32.902 "assigned_rate_limits": { 00:27:32.902 "rw_ios_per_sec": 0, 00:27:32.902 "rw_mbytes_per_sec": 0, 00:27:32.902 "r_mbytes_per_sec": 0, 00:27:32.902 "w_mbytes_per_sec": 0 00:27:32.902 }, 00:27:32.902 "claimed": false, 00:27:32.902 "zoned": false, 00:27:32.902 "supported_io_types": { 00:27:32.902 "read": true, 00:27:32.902 "write": true, 00:27:32.902 "unmap": false, 00:27:32.902 "write_zeroes": true, 00:27:32.902 "flush": false, 00:27:32.902 "reset": false, 00:27:32.902 "compare": false, 00:27:32.902 
"compare_and_write": false, 00:27:32.902 "abort": false, 00:27:32.902 "nvme_admin": false, 00:27:32.902 "nvme_io": false 00:27:32.902 }, 00:27:32.902 "driver_specific": { 00:27:32.902 "compress": { 00:27:32.902 "name": "COMP_lvs0/lv0", 00:27:32.902 "base_bdev_name": "94be7082-179f-4159-9995-e4094dd40ca6" 00:27:32.902 } 00:27:32.902 } 00:27:32.902 } 00:27:32.902 ] 00:27:32.902 00:09:33 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:27:32.902 00:09:33 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:32.902 [2024-05-15 00:09:33.447256] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f46581b1330 PMD being used: compress_qat 00:27:32.902 I/O targets: 00:27:32.902 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:27:32.902 00:27:32.902 00:27:32.902 CUnit - A unit testing framework for C - Version 2.1-3 00:27:32.902 http://cunit.sourceforge.net/ 00:27:32.902 00:27:32.902 00:27:32.902 Suite: bdevio tests on: COMP_lvs0/lv0 00:27:32.902 Test: blockdev write read block ...passed 00:27:32.902 Test: blockdev write zeroes read block ...passed 00:27:32.902 Test: blockdev write zeroes read no split ...passed 00:27:32.903 Test: blockdev write zeroes read split ...passed 00:27:32.903 Test: blockdev write zeroes read split partial ...passed 00:27:32.903 Test: blockdev reset ...[2024-05-15 00:09:33.485566] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:27:32.903 passed 00:27:32.903 Test: blockdev write read 8 blocks ...passed 00:27:32.903 Test: blockdev write read size > 128k ...passed 00:27:32.903 Test: blockdev write read invalid size ...passed 00:27:32.903 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:32.903 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:32.903 Test: blockdev write read max offset ...passed 00:27:32.903 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:32.903 Test: blockdev writev readv 8 blocks ...passed 00:27:32.903 Test: blockdev writev readv 30 x 1block ...passed 00:27:32.903 Test: blockdev writev readv block ...passed 00:27:32.903 Test: blockdev writev readv size > 128k ...passed 00:27:32.903 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:32.903 Test: blockdev comparev and writev ...passed 00:27:32.903 Test: blockdev nvme passthru rw ...passed 00:27:32.903 Test: blockdev nvme passthru vendor specific ...passed 00:27:32.903 Test: blockdev nvme admin passthru ...passed 00:27:32.903 Test: blockdev copy ...passed 00:27:32.903 00:27:32.903 Run Summary: Type Total Ran Passed Failed Inactive 00:27:32.903 suites 1 1 n/a 0 0 00:27:32.903 tests 23 23 23 0 0 00:27:32.903 asserts 130 130 130 0 n/a 00:27:32.903 00:27:32.903 Elapsed time = 0.093 seconds 00:27:33.160 0 00:27:33.160 00:09:33 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:27:33.160 00:09:33 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:33.417 00:09:33 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:33.676 00:09:34 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:27:33.676 00:09:34 compress_compdev -- compress/compress.sh@62 -- # killprocess 538232 00:27:33.676 00:09:34 compress_compdev -- 
common/autotest_common.sh@946 -- # '[' -z 538232 ']' 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 538232 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 538232 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 538232' 00:27:33.676 killing process with pid 538232 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@965 -- # kill 538232 00:27:33.676 00:09:34 compress_compdev -- common/autotest_common.sh@970 -- # wait 538232 00:27:36.960 00:09:37 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:27:36.960 00:09:37 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:27:36.960 00:27:36.960 real 0m47.738s 00:27:36.960 user 1m50.473s 00:27:36.960 sys 0m5.810s 00:27:36.960 00:09:37 compress_compdev -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:36.960 00:09:37 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:36.960 ************************************ 00:27:36.960 END TEST compress_compdev 00:27:36.960 ************************************ 00:27:36.960 00:09:37 -- spdk/autotest.sh@345 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:36.960 00:09:37 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:36.960 00:09:37 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:36.960 00:09:37 -- common/autotest_common.sh@10 -- # set +x 00:27:36.960 ************************************ 00:27:36.960 START TEST compress_isal 00:27:36.960 ************************************ 00:27:36.960 00:09:37 compress_isal -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:36.960 * Looking for test storage... 
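With the compressdev (QAT) pass finished, autotest reruns the identical compress.sh harness with the software ISA-L backend; only the accel backend changes, while the lvstore/lvol/compress-vbdev plumbing and the bdevperf workloads stay the same. A sketch of the two standalone invocations, assuming compress.sh takes the backend name as its first argument the way the isal run above is launched (the compdev spelling is inferred from the test_type checks in the trace, not shown verbatim):

  # hardware path: accel_dpdk_compressdev with the compress_qat PMD
  test/compress/compress.sh compdev
  # software path: ISA-L compression, no PMD or dpdk.json needed
  test/compress/compress.sh isal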
00:27:36.960 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:27:36.960 00:09:37 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:36.960 00:09:37 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:36.960 00:09:37 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:36.960 00:09:37 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:36.960 00:09:37 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:36.960 00:09:37 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:36.960 00:09:37 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:36.960 00:09:37 compress_isal -- paths/export.sh@5 -- # export PATH 00:27:36.960 00:09:37 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@47 -- # : 0 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:36.960 00:09:37 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:36.960 00:09:37 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:36.960 00:09:37 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:27:36.960 00:09:37 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:27:36.960 00:09:37 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:27:36.961 00:09:37 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:36.961 00:09:37 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=539583 00:27:36.961 00:09:37 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:36.961 00:09:37 compress_isal -- compress/compress.sh@73 -- # waitforlisten 539583 00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 539583 ']' 00:27:36.961 00:09:37 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:36.961 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
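run_bdevperf starts the bdevperf app idle and only kicks off the verify workload once the RPC socket is listening and the COMP volume exists. A condensed sketch of that pattern, assuming the default /var/tmp/spdk.sock socket that waitforlisten polls and the backgrounding that run_bdevperf implies (the log only shows the resulting command line and pid):

  # start bdevperf waiting for RPCs (-z): qd 32, 4 KiB verify I/O for 3 s,
  # RPC-managed bdevs (-C), reactors on cores 1-2 (-m 0x6)
  build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
  bdevperf_pid=$!
  # ...wait for the socket, then create_vols as traced above...
  # trigger the timed run; afterwards destroy_vols and killprocess exactly as
  # in the compressdev pass
  examples/bdev/bdevperf/bdevperf.py perform_tests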
00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:36.961 00:09:37 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:36.961 [2024-05-15 00:09:37.403849] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:27:36.961 [2024-05-15 00:09:37.403920] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid539583 ] 00:27:36.961 [2024-05-15 00:09:37.525709] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:37.220 [2024-05-15 00:09:37.624072] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:37.220 [2024-05-15 00:09:37.624078] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:37.787 00:09:38 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:37.787 00:09:38 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:27:37.787 00:09:38 compress_isal -- compress/compress.sh@74 -- # create_vols 00:27:37.787 00:09:38 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:37.787 00:09:38 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:38.353 00:09:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:38.353 00:09:38 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.611 00:09:39 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:38.869 [ 00:27:38.869 { 00:27:38.869 "name": "Nvme0n1", 00:27:38.869 "aliases": [ 00:27:38.869 "01000000-0000-0000-5cd2-e43197705251" 00:27:38.870 ], 00:27:38.870 "product_name": "NVMe disk", 00:27:38.870 "block_size": 512, 00:27:38.870 "num_blocks": 15002931888, 00:27:38.870 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:38.870 "assigned_rate_limits": { 00:27:38.870 "rw_ios_per_sec": 0, 00:27:38.870 "rw_mbytes_per_sec": 0, 00:27:38.870 "r_mbytes_per_sec": 0, 00:27:38.870 "w_mbytes_per_sec": 0 00:27:38.870 }, 00:27:38.870 "claimed": false, 00:27:38.870 "zoned": false, 00:27:38.870 "supported_io_types": { 00:27:38.870 "read": true, 00:27:38.870 "write": true, 00:27:38.870 "unmap": true, 00:27:38.870 "write_zeroes": true, 00:27:38.870 "flush": true, 00:27:38.870 "reset": true, 00:27:38.870 "compare": false, 00:27:38.870 "compare_and_write": false, 00:27:38.870 "abort": true, 00:27:38.870 "nvme_admin": true, 00:27:38.870 "nvme_io": true 00:27:38.870 }, 00:27:38.870 "driver_specific": { 00:27:38.870 "nvme": [ 00:27:38.870 { 00:27:38.870 "pci_address": "0000:5e:00.0", 00:27:38.870 "trid": { 00:27:38.870 "trtype": "PCIe", 00:27:38.870 "traddr": "0000:5e:00.0" 00:27:38.870 }, 00:27:38.870 "ctrlr_data": { 00:27:38.870 "cntlid": 0, 
00:27:38.870 "vendor_id": "0x8086", 00:27:38.870 "model_number": "INTEL SSDPF2KX076TZO", 00:27:38.870 "serial_number": "PHAC0301002G7P6CGN", 00:27:38.870 "firmware_revision": "JCV10200", 00:27:38.870 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:38.870 "oacs": { 00:27:38.870 "security": 1, 00:27:38.870 "format": 1, 00:27:38.870 "firmware": 1, 00:27:38.870 "ns_manage": 1 00:27:38.870 }, 00:27:38.870 "multi_ctrlr": false, 00:27:38.870 "ana_reporting": false 00:27:38.870 }, 00:27:38.870 "vs": { 00:27:38.870 "nvme_version": "1.3" 00:27:38.870 }, 00:27:38.870 "ns_data": { 00:27:38.870 "id": 1, 00:27:38.870 "can_share": false 00:27:38.870 }, 00:27:38.870 "security": { 00:27:38.870 "opal": true 00:27:38.870 } 00:27:38.870 } 00:27:38.870 ], 00:27:38.870 "mp_policy": "active_passive" 00:27:38.870 } 00:27:38.870 } 00:27:38.870 ] 00:27:38.870 00:09:39 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:38.870 00:09:39 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:41.420 0d8abe00-57a4-4535-b47c-20ad5f86e2ee 00:27:41.420 00:09:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:41.694 aa778dc4-e52e-4e4d-b9a7-529363621d20 00:27:41.694 00:09:42 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:41.694 00:09:42 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:41.953 00:09:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:42.211 [ 00:27:42.211 { 00:27:42.211 "name": "aa778dc4-e52e-4e4d-b9a7-529363621d20", 00:27:42.211 "aliases": [ 00:27:42.211 "lvs0/lv0" 00:27:42.211 ], 00:27:42.211 "product_name": "Logical Volume", 00:27:42.211 "block_size": 512, 00:27:42.211 "num_blocks": 204800, 00:27:42.211 "uuid": "aa778dc4-e52e-4e4d-b9a7-529363621d20", 00:27:42.211 "assigned_rate_limits": { 00:27:42.211 "rw_ios_per_sec": 0, 00:27:42.211 "rw_mbytes_per_sec": 0, 00:27:42.211 "r_mbytes_per_sec": 0, 00:27:42.211 "w_mbytes_per_sec": 0 00:27:42.211 }, 00:27:42.211 "claimed": false, 00:27:42.211 "zoned": false, 00:27:42.211 "supported_io_types": { 00:27:42.211 "read": true, 00:27:42.211 "write": true, 00:27:42.211 "unmap": true, 00:27:42.211 "write_zeroes": true, 00:27:42.211 "flush": false, 00:27:42.211 "reset": true, 00:27:42.211 "compare": false, 00:27:42.211 "compare_and_write": false, 00:27:42.211 "abort": false, 00:27:42.211 "nvme_admin": false, 00:27:42.211 "nvme_io": false 00:27:42.211 }, 00:27:42.211 "driver_specific": { 00:27:42.211 "lvol": { 00:27:42.211 "lvol_store_uuid": "0d8abe00-57a4-4535-b47c-20ad5f86e2ee", 00:27:42.211 "base_bdev": "Nvme0n1", 00:27:42.211 "thin_provision": true, 00:27:42.211 "num_allocated_clusters": 0, 00:27:42.211 "snapshot": false, 00:27:42.211 "clone": false, 
00:27:42.211 "esnap_clone": false 00:27:42.211 } 00:27:42.211 } 00:27:42.211 } 00:27:42.211 ] 00:27:42.211 00:09:42 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:42.211 00:09:42 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:42.212 00:09:42 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:42.470 [2024-05-15 00:09:42.858769] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:42.470 COMP_lvs0/lv0 00:27:42.470 00:09:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:42.470 00:09:42 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:42.729 00:09:43 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:42.988 [ 00:27:42.988 { 00:27:42.988 "name": "COMP_lvs0/lv0", 00:27:42.988 "aliases": [ 00:27:42.988 "5c533573-aef2-5433-a241-9e11dd9caa6d" 00:27:42.988 ], 00:27:42.988 "product_name": "compress", 00:27:42.988 "block_size": 512, 00:27:42.988 "num_blocks": 200704, 00:27:42.988 "uuid": "5c533573-aef2-5433-a241-9e11dd9caa6d", 00:27:42.988 "assigned_rate_limits": { 00:27:42.988 "rw_ios_per_sec": 0, 00:27:42.988 "rw_mbytes_per_sec": 0, 00:27:42.988 "r_mbytes_per_sec": 0, 00:27:42.988 "w_mbytes_per_sec": 0 00:27:42.988 }, 00:27:42.988 "claimed": false, 00:27:42.988 "zoned": false, 00:27:42.988 "supported_io_types": { 00:27:42.988 "read": true, 00:27:42.988 "write": true, 00:27:42.988 "unmap": false, 00:27:42.988 "write_zeroes": true, 00:27:42.988 "flush": false, 00:27:42.988 "reset": false, 00:27:42.988 "compare": false, 00:27:42.988 "compare_and_write": false, 00:27:42.988 "abort": false, 00:27:42.988 "nvme_admin": false, 00:27:42.988 "nvme_io": false 00:27:42.988 }, 00:27:42.988 "driver_specific": { 00:27:42.988 "compress": { 00:27:42.988 "name": "COMP_lvs0/lv0", 00:27:42.988 "base_bdev_name": "aa778dc4-e52e-4e4d-b9a7-529363621d20" 00:27:42.988 } 00:27:42.988 } 00:27:42.988 } 00:27:42.988 ] 00:27:42.988 00:09:43 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:42.988 00:09:43 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:42.988 Running I/O for 3 seconds... 
00:27:46.273 00:27:46.273 Latency(us) 00:27:46.273 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:46.273 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:46.273 Verification LBA range: start 0x0 length 0x3100 00:27:46.273 COMP_lvs0/lv0 : 3.00 3902.07 15.24 0.00 0.00 8145.25 676.73 7522.39 00:27:46.273 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:46.273 Verification LBA range: start 0x3100 length 0x3100 00:27:46.273 COMP_lvs0/lv0 : 3.00 3905.27 15.25 0.00 0.00 8152.21 633.99 7522.39 00:27:46.273 =================================================================================================================== 00:27:46.274 Total : 7807.34 30.50 0.00 0.00 8148.73 633.99 7522.39 00:27:46.274 0 00:27:46.274 00:09:46 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:46.274 00:09:46 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:46.274 00:09:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:46.533 00:09:47 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:46.533 00:09:47 compress_isal -- compress/compress.sh@78 -- # killprocess 539583 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 539583 ']' 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@950 -- # kill -0 539583 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 539583 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 539583' 00:27:46.533 killing process with pid 539583 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@965 -- # kill 539583 00:27:46.533 Received shutdown signal, test time was about 3.000000 seconds 00:27:46.533 00:27:46.533 Latency(us) 00:27:46.533 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:46.533 =================================================================================================================== 00:27:46.533 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:46.533 00:09:47 compress_isal -- common/autotest_common.sh@970 -- # wait 539583 00:27:49.817 00:09:49 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:49.817 00:09:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:49.817 00:09:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=541185 00:27:49.817 00:09:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:49.817 00:09:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:49.818 00:09:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 541185 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 541185 
']' 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:49.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:49.818 00:09:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:49.818 [2024-05-15 00:09:49.924855] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:27:49.818 [2024-05-15 00:09:49.924934] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid541185 ] 00:27:49.818 [2024-05-15 00:09:50.048769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:49.818 [2024-05-15 00:09:50.159187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:49.818 [2024-05-15 00:09:50.159193] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:50.383 00:09:50 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:50.383 00:09:50 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:27:50.384 00:09:50 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:27:50.384 00:09:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:50.384 00:09:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:50.951 00:09:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:50.951 00:09:51 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:51.209 00:09:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:51.468 [ 00:27:51.468 { 00:27:51.468 "name": "Nvme0n1", 00:27:51.468 "aliases": [ 00:27:51.468 "01000000-0000-0000-5cd2-e43197705251" 00:27:51.468 ], 00:27:51.468 "product_name": "NVMe disk", 00:27:51.468 "block_size": 512, 00:27:51.468 "num_blocks": 15002931888, 00:27:51.468 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:51.468 "assigned_rate_limits": { 00:27:51.468 "rw_ios_per_sec": 0, 00:27:51.468 "rw_mbytes_per_sec": 0, 00:27:51.468 "r_mbytes_per_sec": 0, 00:27:51.468 "w_mbytes_per_sec": 0 00:27:51.468 }, 00:27:51.468 "claimed": false, 00:27:51.468 "zoned": false, 00:27:51.468 "supported_io_types": { 00:27:51.468 "read": true, 00:27:51.468 "write": true, 00:27:51.468 "unmap": true, 00:27:51.468 "write_zeroes": true, 00:27:51.468 "flush": true, 00:27:51.468 
"reset": true, 00:27:51.468 "compare": false, 00:27:51.468 "compare_and_write": false, 00:27:51.468 "abort": true, 00:27:51.468 "nvme_admin": true, 00:27:51.468 "nvme_io": true 00:27:51.468 }, 00:27:51.468 "driver_specific": { 00:27:51.468 "nvme": [ 00:27:51.468 { 00:27:51.468 "pci_address": "0000:5e:00.0", 00:27:51.468 "trid": { 00:27:51.468 "trtype": "PCIe", 00:27:51.468 "traddr": "0000:5e:00.0" 00:27:51.468 }, 00:27:51.468 "ctrlr_data": { 00:27:51.468 "cntlid": 0, 00:27:51.468 "vendor_id": "0x8086", 00:27:51.468 "model_number": "INTEL SSDPF2KX076TZO", 00:27:51.468 "serial_number": "PHAC0301002G7P6CGN", 00:27:51.468 "firmware_revision": "JCV10200", 00:27:51.468 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:51.468 "oacs": { 00:27:51.468 "security": 1, 00:27:51.468 "format": 1, 00:27:51.468 "firmware": 1, 00:27:51.468 "ns_manage": 1 00:27:51.468 }, 00:27:51.468 "multi_ctrlr": false, 00:27:51.468 "ana_reporting": false 00:27:51.468 }, 00:27:51.468 "vs": { 00:27:51.468 "nvme_version": "1.3" 00:27:51.468 }, 00:27:51.468 "ns_data": { 00:27:51.468 "id": 1, 00:27:51.468 "can_share": false 00:27:51.468 }, 00:27:51.468 "security": { 00:27:51.468 "opal": true 00:27:51.468 } 00:27:51.468 } 00:27:51.468 ], 00:27:51.468 "mp_policy": "active_passive" 00:27:51.468 } 00:27:51.468 } 00:27:51.468 ] 00:27:51.468 00:09:51 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:51.468 00:09:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:54.000 cd4f1c12-10f2-42f5-a25c-7560be4113a4 00:27:54.000 00:09:54 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:54.000 3e214e4c-0a40-48e2-9259-2d7db069fbe5 00:27:54.000 00:09:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:54.000 00:09:54 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:54.259 00:09:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:54.518 [ 00:27:54.518 { 00:27:54.518 "name": "3e214e4c-0a40-48e2-9259-2d7db069fbe5", 00:27:54.518 "aliases": [ 00:27:54.518 "lvs0/lv0" 00:27:54.518 ], 00:27:54.518 "product_name": "Logical Volume", 00:27:54.518 "block_size": 512, 00:27:54.518 "num_blocks": 204800, 00:27:54.518 "uuid": "3e214e4c-0a40-48e2-9259-2d7db069fbe5", 00:27:54.518 "assigned_rate_limits": { 00:27:54.518 "rw_ios_per_sec": 0, 00:27:54.518 "rw_mbytes_per_sec": 0, 00:27:54.518 "r_mbytes_per_sec": 0, 00:27:54.518 "w_mbytes_per_sec": 0 00:27:54.518 }, 00:27:54.518 "claimed": false, 00:27:54.518 "zoned": false, 00:27:54.518 "supported_io_types": { 00:27:54.518 "read": true, 00:27:54.518 "write": true, 00:27:54.518 "unmap": true, 00:27:54.518 "write_zeroes": true, 00:27:54.518 "flush": false, 00:27:54.518 "reset": true, 00:27:54.518 "compare": 
false, 00:27:54.518 "compare_and_write": false, 00:27:54.518 "abort": false, 00:27:54.518 "nvme_admin": false, 00:27:54.518 "nvme_io": false 00:27:54.518 }, 00:27:54.518 "driver_specific": { 00:27:54.518 "lvol": { 00:27:54.518 "lvol_store_uuid": "cd4f1c12-10f2-42f5-a25c-7560be4113a4", 00:27:54.518 "base_bdev": "Nvme0n1", 00:27:54.518 "thin_provision": true, 00:27:54.518 "num_allocated_clusters": 0, 00:27:54.518 "snapshot": false, 00:27:54.518 "clone": false, 00:27:54.518 "esnap_clone": false 00:27:54.518 } 00:27:54.518 } 00:27:54.518 } 00:27:54.518 ] 00:27:54.518 00:09:55 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:54.518 00:09:55 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:54.518 00:09:55 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:54.777 [2024-05-15 00:09:55.244191] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:54.777 COMP_lvs0/lv0 00:27:54.777 00:09:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:54.777 00:09:55 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:55.036 00:09:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:55.294 [ 00:27:55.294 { 00:27:55.294 "name": "COMP_lvs0/lv0", 00:27:55.294 "aliases": [ 00:27:55.294 "25a225e4-b11d-593a-88f1-56978bd58fb3" 00:27:55.294 ], 00:27:55.294 "product_name": "compress", 00:27:55.294 "block_size": 512, 00:27:55.294 "num_blocks": 200704, 00:27:55.294 "uuid": "25a225e4-b11d-593a-88f1-56978bd58fb3", 00:27:55.294 "assigned_rate_limits": { 00:27:55.294 "rw_ios_per_sec": 0, 00:27:55.294 "rw_mbytes_per_sec": 0, 00:27:55.294 "r_mbytes_per_sec": 0, 00:27:55.294 "w_mbytes_per_sec": 0 00:27:55.294 }, 00:27:55.294 "claimed": false, 00:27:55.294 "zoned": false, 00:27:55.294 "supported_io_types": { 00:27:55.294 "read": true, 00:27:55.294 "write": true, 00:27:55.294 "unmap": false, 00:27:55.294 "write_zeroes": true, 00:27:55.294 "flush": false, 00:27:55.294 "reset": false, 00:27:55.294 "compare": false, 00:27:55.294 "compare_and_write": false, 00:27:55.294 "abort": false, 00:27:55.294 "nvme_admin": false, 00:27:55.294 "nvme_io": false 00:27:55.294 }, 00:27:55.294 "driver_specific": { 00:27:55.294 "compress": { 00:27:55.294 "name": "COMP_lvs0/lv0", 00:27:55.294 "base_bdev_name": "3e214e4c-0a40-48e2-9259-2d7db069fbe5" 00:27:55.294 } 00:27:55.294 } 00:27:55.294 } 00:27:55.294 ] 00:27:55.294 00:09:55 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:55.294 00:09:55 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:55.294 Running I/O for 3 seconds... 
00:27:58.579 00:27:58.579 Latency(us) 00:27:58.579 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:58.579 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:58.579 Verification LBA range: start 0x0 length 0x3100 00:27:58.579 COMP_lvs0/lv0 : 3.01 2879.00 11.25 0.00 0.00 11066.00 673.17 9573.95 00:27:58.579 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:58.579 Verification LBA range: start 0x3100 length 0x3100 00:27:58.579 COMP_lvs0/lv0 : 3.01 2876.01 11.23 0.00 0.00 11087.05 918.93 9630.94 00:27:58.579 =================================================================================================================== 00:27:58.579 Total : 5755.01 22.48 0.00 0.00 11076.52 673.17 9630.94 00:27:58.579 0 00:27:58.579 00:09:58 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:58.579 00:09:58 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:58.579 00:09:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:58.838 00:09:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:58.838 00:09:59 compress_isal -- compress/compress.sh@78 -- # killprocess 541185 00:27:58.838 00:09:59 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 541185 ']' 00:27:58.838 00:09:59 compress_isal -- common/autotest_common.sh@950 -- # kill -0 541185 00:27:58.838 00:09:59 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:58.838 00:09:59 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:58.838 00:09:59 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 541185 00:27:59.096 00:09:59 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:59.096 00:09:59 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:59.096 00:09:59 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 541185' 00:27:59.096 killing process with pid 541185 00:27:59.096 00:09:59 compress_isal -- common/autotest_common.sh@965 -- # kill 541185 00:27:59.096 Received shutdown signal, test time was about 3.000000 seconds 00:27:59.096 00:27:59.096 Latency(us) 00:27:59.096 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:59.097 =================================================================================================================== 00:27:59.097 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:59.097 00:09:59 compress_isal -- common/autotest_common.sh@970 -- # wait 541185 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=542781 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:28:01.628 00:10:02 compress_isal -- compress/compress.sh@73 -- # waitforlisten 542781 00:28:01.628 00:10:02 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 
542781 ']' 00:28:01.628 00:10:02 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:01.629 00:10:02 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:01.629 00:10:02 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:01.629 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:01.629 00:10:02 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:01.629 00:10:02 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:01.887 [2024-05-15 00:10:02.271240] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:28:01.887 [2024-05-15 00:10:02.271307] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid542781 ] 00:28:01.887 [2024-05-15 00:10:02.391267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:02.145 [2024-05-15 00:10:02.496883] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:28:02.145 [2024-05-15 00:10:02.496890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:02.710 00:10:03 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:02.710 00:10:03 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:28:02.710 00:10:03 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:28:02.710 00:10:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:02.710 00:10:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:03.276 00:10:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:03.276 00:10:03 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:03.534 00:10:04 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:03.792 [ 00:28:03.792 { 00:28:03.792 "name": "Nvme0n1", 00:28:03.792 "aliases": [ 00:28:03.792 "01000000-0000-0000-5cd2-e43197705251" 00:28:03.792 ], 00:28:03.792 "product_name": "NVMe disk", 00:28:03.792 "block_size": 512, 00:28:03.792 "num_blocks": 15002931888, 00:28:03.792 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:28:03.792 "assigned_rate_limits": { 00:28:03.792 "rw_ios_per_sec": 0, 00:28:03.792 "rw_mbytes_per_sec": 0, 00:28:03.792 "r_mbytes_per_sec": 0, 00:28:03.792 "w_mbytes_per_sec": 0 00:28:03.792 }, 00:28:03.792 "claimed": false, 00:28:03.792 "zoned": false, 00:28:03.792 "supported_io_types": { 00:28:03.792 "read": true, 00:28:03.792 "write": true, 00:28:03.792 "unmap": true, 00:28:03.792 "write_zeroes": true, 00:28:03.792 "flush": true, 
00:28:03.792 "reset": true, 00:28:03.792 "compare": false, 00:28:03.792 "compare_and_write": false, 00:28:03.792 "abort": true, 00:28:03.792 "nvme_admin": true, 00:28:03.792 "nvme_io": true 00:28:03.792 }, 00:28:03.792 "driver_specific": { 00:28:03.792 "nvme": [ 00:28:03.792 { 00:28:03.792 "pci_address": "0000:5e:00.0", 00:28:03.792 "trid": { 00:28:03.792 "trtype": "PCIe", 00:28:03.792 "traddr": "0000:5e:00.0" 00:28:03.792 }, 00:28:03.792 "ctrlr_data": { 00:28:03.792 "cntlid": 0, 00:28:03.792 "vendor_id": "0x8086", 00:28:03.792 "model_number": "INTEL SSDPF2KX076TZO", 00:28:03.792 "serial_number": "PHAC0301002G7P6CGN", 00:28:03.792 "firmware_revision": "JCV10200", 00:28:03.792 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:28:03.792 "oacs": { 00:28:03.792 "security": 1, 00:28:03.792 "format": 1, 00:28:03.792 "firmware": 1, 00:28:03.792 "ns_manage": 1 00:28:03.792 }, 00:28:03.792 "multi_ctrlr": false, 00:28:03.792 "ana_reporting": false 00:28:03.792 }, 00:28:03.792 "vs": { 00:28:03.792 "nvme_version": "1.3" 00:28:03.792 }, 00:28:03.792 "ns_data": { 00:28:03.792 "id": 1, 00:28:03.792 "can_share": false 00:28:03.792 }, 00:28:03.792 "security": { 00:28:03.792 "opal": true 00:28:03.792 } 00:28:03.792 } 00:28:03.792 ], 00:28:03.792 "mp_policy": "active_passive" 00:28:03.792 } 00:28:03.792 } 00:28:03.792 ] 00:28:03.792 00:10:04 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:03.792 00:10:04 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:06.382 680425f9-4ffa-4311-8782-ec6404d50eff 00:28:06.382 00:10:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:06.382 d06e68f4-d51d-478e-8280-0ae9c847cb31 00:28:06.382 00:10:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:06.382 00:10:06 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:06.641 00:10:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:06.899 [ 00:28:06.899 { 00:28:06.899 "name": "d06e68f4-d51d-478e-8280-0ae9c847cb31", 00:28:06.899 "aliases": [ 00:28:06.899 "lvs0/lv0" 00:28:06.899 ], 00:28:06.899 "product_name": "Logical Volume", 00:28:06.899 "block_size": 512, 00:28:06.899 "num_blocks": 204800, 00:28:06.899 "uuid": "d06e68f4-d51d-478e-8280-0ae9c847cb31", 00:28:06.899 "assigned_rate_limits": { 00:28:06.899 "rw_ios_per_sec": 0, 00:28:06.899 "rw_mbytes_per_sec": 0, 00:28:06.899 "r_mbytes_per_sec": 0, 00:28:06.899 "w_mbytes_per_sec": 0 00:28:06.899 }, 00:28:06.899 "claimed": false, 00:28:06.899 "zoned": false, 00:28:06.899 "supported_io_types": { 00:28:06.899 "read": true, 00:28:06.899 "write": true, 00:28:06.899 "unmap": true, 00:28:06.899 "write_zeroes": true, 00:28:06.899 "flush": false, 00:28:06.899 "reset": true, 
00:28:06.899 "compare": false, 00:28:06.899 "compare_and_write": false, 00:28:06.899 "abort": false, 00:28:06.899 "nvme_admin": false, 00:28:06.899 "nvme_io": false 00:28:06.899 }, 00:28:06.899 "driver_specific": { 00:28:06.899 "lvol": { 00:28:06.899 "lvol_store_uuid": "680425f9-4ffa-4311-8782-ec6404d50eff", 00:28:06.899 "base_bdev": "Nvme0n1", 00:28:06.899 "thin_provision": true, 00:28:06.899 "num_allocated_clusters": 0, 00:28:06.899 "snapshot": false, 00:28:06.899 "clone": false, 00:28:06.899 "esnap_clone": false 00:28:06.899 } 00:28:06.899 } 00:28:06.899 } 00:28:06.899 ] 00:28:06.899 00:10:07 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:06.899 00:10:07 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:28:06.899 00:10:07 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:28:07.158 [2024-05-15 00:10:07.648355] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:07.158 COMP_lvs0/lv0 00:28:07.158 00:10:07 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:07.158 00:10:07 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:07.415 00:10:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:07.673 [ 00:28:07.673 { 00:28:07.673 "name": "COMP_lvs0/lv0", 00:28:07.673 "aliases": [ 00:28:07.673 "9ca7ee8d-8ee2-5e87-a071-fae88b8de6d3" 00:28:07.673 ], 00:28:07.673 "product_name": "compress", 00:28:07.673 "block_size": 4096, 00:28:07.673 "num_blocks": 25088, 00:28:07.673 "uuid": "9ca7ee8d-8ee2-5e87-a071-fae88b8de6d3", 00:28:07.673 "assigned_rate_limits": { 00:28:07.673 "rw_ios_per_sec": 0, 00:28:07.673 "rw_mbytes_per_sec": 0, 00:28:07.673 "r_mbytes_per_sec": 0, 00:28:07.673 "w_mbytes_per_sec": 0 00:28:07.673 }, 00:28:07.673 "claimed": false, 00:28:07.673 "zoned": false, 00:28:07.673 "supported_io_types": { 00:28:07.673 "read": true, 00:28:07.673 "write": true, 00:28:07.673 "unmap": false, 00:28:07.673 "write_zeroes": true, 00:28:07.673 "flush": false, 00:28:07.673 "reset": false, 00:28:07.673 "compare": false, 00:28:07.674 "compare_and_write": false, 00:28:07.674 "abort": false, 00:28:07.674 "nvme_admin": false, 00:28:07.674 "nvme_io": false 00:28:07.674 }, 00:28:07.674 "driver_specific": { 00:28:07.674 "compress": { 00:28:07.674 "name": "COMP_lvs0/lv0", 00:28:07.674 "base_bdev_name": "d06e68f4-d51d-478e-8280-0ae9c847cb31" 00:28:07.674 } 00:28:07.674 } 00:28:07.674 } 00:28:07.674 ] 00:28:07.674 00:10:08 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:07.674 00:10:08 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:07.931 Running I/O for 3 seconds... 
00:28:11.215 00:28:11.215 Latency(us) 00:28:11.215 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:11.215 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:11.215 Verification LBA range: start 0x0 length 0x3100 00:28:11.215 COMP_lvs0/lv0 : 3.01 2900.30 11.33 0.00 0.00 10987.26 673.17 10086.85 00:28:11.215 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:11.215 Verification LBA range: start 0x3100 length 0x3100 00:28:11.215 COMP_lvs0/lv0 : 3.01 2897.64 11.32 0.00 0.00 11005.12 918.93 10086.85 00:28:11.215 =================================================================================================================== 00:28:11.215 Total : 5797.94 22.65 0.00 0.00 10996.19 673.17 10086.85 00:28:11.215 0 00:28:11.215 00:10:11 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:28:11.215 00:10:11 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:11.215 00:10:11 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:11.473 00:10:11 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:11.473 00:10:11 compress_isal -- compress/compress.sh@78 -- # killprocess 542781 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 542781 ']' 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@950 -- # kill -0 542781 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@951 -- # uname 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 542781 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 542781' 00:28:11.473 killing process with pid 542781 00:28:11.473 00:10:11 compress_isal -- common/autotest_common.sh@965 -- # kill 542781 00:28:11.473 Received shutdown signal, test time was about 3.000000 seconds 00:28:11.473 00:28:11.473 Latency(us) 00:28:11.473 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:11.473 =================================================================================================================== 00:28:11.473 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:11.474 00:10:11 compress_isal -- common/autotest_common.sh@970 -- # wait 542781 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=544385 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:28:14.758 00:10:14 compress_isal -- compress/compress.sh@57 -- # waitforlisten 544385 00:28:14.758 00:10:14 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 544385 ']' 00:28:14.758 00:10:14 compress_isal -- 
common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:14.758 00:10:14 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:14.758 00:10:14 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:14.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:14.758 00:10:14 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:14.758 00:10:14 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:14.758 [2024-05-15 00:10:14.723874] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:28:14.758 [2024-05-15 00:10:14.723935] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid544385 ] 00:28:14.758 [2024-05-15 00:10:14.837969] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:14.758 [2024-05-15 00:10:14.944356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:14.758 [2024-05-15 00:10:14.944431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:28:14.758 [2024-05-15 00:10:14.944437] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.016 00:10:15 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:15.016 00:10:15 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:28:15.016 00:10:15 compress_isal -- compress/compress.sh@58 -- # create_vols 00:28:15.016 00:10:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:15.016 00:10:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:15.951 00:10:16 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:15.951 00:10:16 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:16.210 [ 00:28:16.210 { 00:28:16.210 "name": "Nvme0n1", 00:28:16.210 "aliases": [ 00:28:16.210 "01000000-0000-0000-5cd2-e43197705251" 00:28:16.210 ], 00:28:16.210 "product_name": "NVMe disk", 00:28:16.210 "block_size": 512, 00:28:16.210 "num_blocks": 15002931888, 00:28:16.210 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:28:16.210 "assigned_rate_limits": { 00:28:16.210 "rw_ios_per_sec": 0, 00:28:16.210 "rw_mbytes_per_sec": 0, 00:28:16.210 "r_mbytes_per_sec": 0, 00:28:16.210 "w_mbytes_per_sec": 0 00:28:16.210 }, 00:28:16.210 "claimed": false, 00:28:16.210 "zoned": false, 00:28:16.210 "supported_io_types": { 00:28:16.210 "read": true, 00:28:16.210 "write": true, 00:28:16.210 "unmap": true, 00:28:16.210 
"write_zeroes": true, 00:28:16.210 "flush": true, 00:28:16.210 "reset": true, 00:28:16.210 "compare": false, 00:28:16.210 "compare_and_write": false, 00:28:16.210 "abort": true, 00:28:16.210 "nvme_admin": true, 00:28:16.210 "nvme_io": true 00:28:16.210 }, 00:28:16.210 "driver_specific": { 00:28:16.210 "nvme": [ 00:28:16.210 { 00:28:16.210 "pci_address": "0000:5e:00.0", 00:28:16.210 "trid": { 00:28:16.210 "trtype": "PCIe", 00:28:16.210 "traddr": "0000:5e:00.0" 00:28:16.210 }, 00:28:16.210 "ctrlr_data": { 00:28:16.210 "cntlid": 0, 00:28:16.210 "vendor_id": "0x8086", 00:28:16.210 "model_number": "INTEL SSDPF2KX076TZO", 00:28:16.210 "serial_number": "PHAC0301002G7P6CGN", 00:28:16.210 "firmware_revision": "JCV10200", 00:28:16.210 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:28:16.210 "oacs": { 00:28:16.210 "security": 1, 00:28:16.210 "format": 1, 00:28:16.210 "firmware": 1, 00:28:16.210 "ns_manage": 1 00:28:16.210 }, 00:28:16.210 "multi_ctrlr": false, 00:28:16.210 "ana_reporting": false 00:28:16.210 }, 00:28:16.210 "vs": { 00:28:16.210 "nvme_version": "1.3" 00:28:16.210 }, 00:28:16.210 "ns_data": { 00:28:16.210 "id": 1, 00:28:16.210 "can_share": false 00:28:16.210 }, 00:28:16.210 "security": { 00:28:16.210 "opal": true 00:28:16.210 } 00:28:16.210 } 00:28:16.210 ], 00:28:16.210 "mp_policy": "active_passive" 00:28:16.210 } 00:28:16.210 } 00:28:16.210 ] 00:28:16.210 00:10:16 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:16.210 00:10:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:18.744 69c5b627-5022-4c9d-8c46-5061dab71d73 00:28:18.744 00:10:18 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:18.744 97f0c27b-c92b-4f81-8005-1576eb214395 00:28:18.744 00:10:19 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:18.744 00:10:19 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:19.001 00:10:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:19.001 [ 00:28:19.001 { 00:28:19.001 "name": "97f0c27b-c92b-4f81-8005-1576eb214395", 00:28:19.001 "aliases": [ 00:28:19.001 "lvs0/lv0" 00:28:19.001 ], 00:28:19.001 "product_name": "Logical Volume", 00:28:19.001 "block_size": 512, 00:28:19.001 "num_blocks": 204800, 00:28:19.001 "uuid": "97f0c27b-c92b-4f81-8005-1576eb214395", 00:28:19.001 "assigned_rate_limits": { 00:28:19.001 "rw_ios_per_sec": 0, 00:28:19.002 "rw_mbytes_per_sec": 0, 00:28:19.002 "r_mbytes_per_sec": 0, 00:28:19.002 "w_mbytes_per_sec": 0 00:28:19.002 }, 00:28:19.002 "claimed": false, 00:28:19.002 "zoned": false, 00:28:19.002 "supported_io_types": { 00:28:19.002 "read": true, 00:28:19.002 "write": true, 00:28:19.002 "unmap": true, 00:28:19.002 "write_zeroes": true, 00:28:19.002 
"flush": false, 00:28:19.002 "reset": true, 00:28:19.002 "compare": false, 00:28:19.002 "compare_and_write": false, 00:28:19.002 "abort": false, 00:28:19.002 "nvme_admin": false, 00:28:19.002 "nvme_io": false 00:28:19.002 }, 00:28:19.002 "driver_specific": { 00:28:19.002 "lvol": { 00:28:19.002 "lvol_store_uuid": "69c5b627-5022-4c9d-8c46-5061dab71d73", 00:28:19.002 "base_bdev": "Nvme0n1", 00:28:19.002 "thin_provision": true, 00:28:19.002 "num_allocated_clusters": 0, 00:28:19.002 "snapshot": false, 00:28:19.002 "clone": false, 00:28:19.002 "esnap_clone": false 00:28:19.002 } 00:28:19.002 } 00:28:19.002 } 00:28:19.002 ] 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:19.259 00:10:19 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:19.259 00:10:19 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:19.259 [2024-05-15 00:10:19.773044] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:19.259 COMP_lvs0/lv0 00:28:19.259 00:10:19 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@897 -- # local i 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:19.259 00:10:19 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:19.517 00:10:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:19.773 [ 00:28:19.773 { 00:28:19.773 "name": "COMP_lvs0/lv0", 00:28:19.773 "aliases": [ 00:28:19.773 "612835b6-d294-544f-80e9-692af40ba0dd" 00:28:19.773 ], 00:28:19.773 "product_name": "compress", 00:28:19.773 "block_size": 512, 00:28:19.773 "num_blocks": 200704, 00:28:19.773 "uuid": "612835b6-d294-544f-80e9-692af40ba0dd", 00:28:19.773 "assigned_rate_limits": { 00:28:19.773 "rw_ios_per_sec": 0, 00:28:19.773 "rw_mbytes_per_sec": 0, 00:28:19.773 "r_mbytes_per_sec": 0, 00:28:19.773 "w_mbytes_per_sec": 0 00:28:19.773 }, 00:28:19.773 "claimed": false, 00:28:19.773 "zoned": false, 00:28:19.773 "supported_io_types": { 00:28:19.773 "read": true, 00:28:19.773 "write": true, 00:28:19.773 "unmap": false, 00:28:19.773 "write_zeroes": true, 00:28:19.773 "flush": false, 00:28:19.773 "reset": false, 00:28:19.773 "compare": false, 00:28:19.773 "compare_and_write": false, 00:28:19.773 "abort": false, 00:28:19.773 "nvme_admin": false, 00:28:19.773 "nvme_io": false 00:28:19.773 }, 00:28:19.773 "driver_specific": { 00:28:19.773 "compress": { 00:28:19.773 "name": "COMP_lvs0/lv0", 00:28:19.773 "base_bdev_name": "97f0c27b-c92b-4f81-8005-1576eb214395" 00:28:19.773 } 00:28:19.773 } 00:28:19.773 } 00:28:19.774 ] 00:28:19.774 00:10:20 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:28:19.774 00:10:20 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:19.774 I/O targets: 00:28:19.774 COMP_lvs0/lv0: 200704 blocks 
of 512 bytes (98 MiB) 00:28:19.774 00:28:19.774 00:28:19.774 CUnit - A unit testing framework for C - Version 2.1-3 00:28:19.774 http://cunit.sourceforge.net/ 00:28:19.774 00:28:19.774 00:28:19.774 Suite: bdevio tests on: COMP_lvs0/lv0 00:28:19.774 Test: blockdev write read block ...passed 00:28:19.774 Test: blockdev write zeroes read block ...passed 00:28:19.774 Test: blockdev write zeroes read no split ...passed 00:28:19.774 Test: blockdev write zeroes read split ...passed 00:28:19.774 Test: blockdev write zeroes read split partial ...passed 00:28:19.774 Test: blockdev reset ...[2024-05-15 00:10:20.297989] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:28:19.774 passed 00:28:19.774 Test: blockdev write read 8 blocks ...passed 00:28:19.774 Test: blockdev write read size > 128k ...passed 00:28:19.774 Test: blockdev write read invalid size ...passed 00:28:19.774 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:19.774 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:19.774 Test: blockdev write read max offset ...passed 00:28:19.774 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:19.774 Test: blockdev writev readv 8 blocks ...passed 00:28:19.774 Test: blockdev writev readv 30 x 1block ...passed 00:28:19.774 Test: blockdev writev readv block ...passed 00:28:19.774 Test: blockdev writev readv size > 128k ...passed 00:28:19.774 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:19.774 Test: blockdev comparev and writev ...passed 00:28:19.774 Test: blockdev nvme passthru rw ...passed 00:28:19.774 Test: blockdev nvme passthru vendor specific ...passed 00:28:19.774 Test: blockdev nvme admin passthru ...passed 00:28:19.774 Test: blockdev copy ...passed 00:28:19.774 00:28:19.774 Run Summary: Type Total Ran Passed Failed Inactive 00:28:19.774 suites 1 1 n/a 0 0 00:28:19.774 tests 23 23 23 0 0 00:28:19.774 asserts 130 130 130 0 n/a 00:28:19.774 00:28:19.774 Elapsed time = 0.113 seconds 00:28:19.774 0 00:28:19.774 00:10:20 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:28:19.774 00:10:20 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:20.031 00:10:20 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:20.290 00:10:20 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:28:20.290 00:10:20 compress_isal -- compress/compress.sh@62 -- # killprocess 544385 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 544385 ']' 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@950 -- # kill -0 544385 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@951 -- # uname 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 544385 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 544385' 00:28:20.290 killing process with pid 544385 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@965 -- # kill 
544385 00:28:20.290 00:10:20 compress_isal -- common/autotest_common.sh@970 -- # wait 544385 00:28:23.606 00:10:23 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:28:23.606 00:10:23 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:28:23.606 00:28:23.606 real 0m46.342s 00:28:23.606 user 1m47.617s 00:28:23.606 sys 0m4.181s 00:28:23.606 00:10:23 compress_isal -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:23.606 00:10:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:23.606 ************************************ 00:28:23.606 END TEST compress_isal 00:28:23.606 ************************************ 00:28:23.606 00:10:23 -- spdk/autotest.sh@348 -- # '[' 0 -eq 1 ']' 00:28:23.606 00:10:23 -- spdk/autotest.sh@352 -- # '[' 1 -eq 1 ']' 00:28:23.606 00:10:23 -- spdk/autotest.sh@353 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:23.606 00:10:23 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:23.606 00:10:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:23.606 00:10:23 -- common/autotest_common.sh@10 -- # set +x 00:28:23.606 ************************************ 00:28:23.606 START TEST blockdev_crypto_aesni 00:28:23.606 ************************************ 00:28:23.606 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:23.606 * Looking for test storage... 00:28:23.606 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:23.606 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:28:23.606 00:10:23 blockdev_crypto_aesni -- 
bdev/blockdev.sh@685 -- # env_ctx= 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=545600 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:23.607 00:10:23 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 545600 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@827 -- # '[' -z 545600 ']' 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:23.607 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:23.607 00:10:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:23.607 [2024-05-15 00:10:23.829200] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
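Before any crypto configuration, blockdev.sh launches a bare spdk_tgt with --wait-for-rpc so accel options can be set ahead of subsystem initialization. A minimal sketch of that start-up pattern; the polling loop below stands in for the waitforlisten helper and is an assumption, not the script's exact code:

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/bin/spdk_tgt --wait-for-rpc &
  # wait until the RPC socket answers before pushing configuration
  until ./scripts/rpc.py rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
  # ... accel/crypto configuration goes here ...
  ./scripts/rpc.py framework_start_init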
00:28:23.607 [2024-05-15 00:10:23.829274] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid545600 ] 00:28:23.607 [2024-05-15 00:10:23.946827] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:23.607 [2024-05-15 00:10:24.044329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:24.175 00:10:24 blockdev_crypto_aesni -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:24.175 00:10:24 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # return 0 00:28:24.175 00:10:24 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:24.175 00:10:24 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:28:24.175 00:10:24 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:28:24.175 00:10:24 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:24.175 00:10:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:24.175 [2024-05-15 00:10:24.682375] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:24.175 [2024-05-15 00:10:24.690414] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:24.175 [2024-05-15 00:10:24.698429] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:24.433 [2024-05-15 00:10:24.767552] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:26.967 true 00:28:26.967 true 00:28:26.967 true 00:28:26.967 true 00:28:26.967 Malloc0 00:28:26.967 Malloc1 00:28:26.967 Malloc2 00:28:26.967 Malloc3 00:28:26.967 [2024-05-15 00:10:27.382000] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:26.967 crypto_ram 00:28:26.967 [2024-05-15 00:10:27.390017] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:26.967 crypto_ram2 00:28:26.967 [2024-05-15 00:10:27.398043] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:26.967 crypto_ram3 00:28:26.967 [2024-05-15 00:10:27.406062] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:26.967 crypto_ram4 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 
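The notices above (driver crypto_aesni_mb, encrypt/decrypt assigned to dpdk_cryptodev, four malloc bdevs wrapped as crypto_ram through crypto_ram4) correspond to the configuration the test pushes over RPC. A hedged sketch of one base-bdev/key pair; the key value is a placeholder and the exact rpc.py argument spellings, in particular for accel_crypto_key_create and bdev_crypto_create, can differ between SPDK releases:

  ./scripts/rpc.py dpdk_cryptodev_set_driver -d crypto_aesni_mb
  ./scripts/rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
  ./scripts/rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
  ./scripts/rpc.py framework_start_init
  # one 32 MiB malloc base bdev and a crypto vbdev bound to key test_dek_aesni_cbc_1
  ./scripts/rpc.py bdev_malloc_create -b Malloc0 32 512
  ./scripts/rpc.py accel_crypto_key_create -c AES_CBC -k 00112233445566778899aabbccddeeff -n test_dek_aesni_cbc_1
  ./scripts/rpc.py bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram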
00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:26.967 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:26.967 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:27.226 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c46d3c0d-d908-580f-a497-826693a4d1a6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c46d3c0d-d908-580f-a497-826693a4d1a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "b49b5245-df65-5a9d-ac52-5a1ec0d45396"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b49b5245-df65-5a9d-ac52-5a1ec0d45396",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b333c8c8-139f-52ba-a094-b58bbc21f953"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b333c8c8-139f-52ba-a094-b58bbc21f953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fe94e446-2760-5bef-84ca-a16ddc887705"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fe94e446-2760-5bef-84ca-a16ddc887705",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:28:27.226 00:10:27 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 545600 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@946 -- # '[' -z 545600 ']' 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # kill -0 545600 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # uname 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 545600 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@964 -- # echo 'killing process with pid 545600' 00:28:27.227 killing process with pid 545600 00:28:27.227 00:10:27 blockdev_crypto_aesni -- 
common/autotest_common.sh@965 -- # kill 545600 00:28:27.227 00:10:27 blockdev_crypto_aesni -- common/autotest_common.sh@970 -- # wait 545600 00:28:27.797 00:10:28 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:27.797 00:10:28 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:27.797 00:10:28 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:28:27.797 00:10:28 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:27.797 00:10:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:27.797 ************************************ 00:28:27.797 START TEST bdev_hello_world 00:28:27.797 ************************************ 00:28:27.797 00:10:28 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:27.797 [2024-05-15 00:10:28.375965] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:28:27.797 [2024-05-15 00:10:28.376031] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid546228 ] 00:28:28.055 [2024-05-15 00:10:28.506444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:28.055 [2024-05-15 00:10:28.608957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:28.055 [2024-05-15 00:10:28.630237] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:28.055 [2024-05-15 00:10:28.638264] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:28.314 [2024-05-15 00:10:28.646281] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:28.314 [2024-05-15 00:10:28.752992] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:30.845 [2024-05-15 00:10:31.203450] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:30.845 [2024-05-15 00:10:31.203523] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:30.845 [2024-05-15 00:10:31.203538] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.845 [2024-05-15 00:10:31.211465] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:30.845 [2024-05-15 00:10:31.211483] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:30.845 [2024-05-15 00:10:31.211495] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.845 [2024-05-15 00:10:31.219484] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:30.845 [2024-05-15 00:10:31.219501] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:30.845 [2024-05-15 00:10:31.219512] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev 
creation deferred pending base bdev arrival 00:28:30.845 [2024-05-15 00:10:31.227504] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:30.845 [2024-05-15 00:10:31.227520] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:30.845 [2024-05-15 00:10:31.227531] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:30.845 [2024-05-15 00:10:31.305153] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:30.845 [2024-05-15 00:10:31.305201] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:30.845 [2024-05-15 00:10:31.305221] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:30.845 [2024-05-15 00:10:31.306501] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:30.845 [2024-05-15 00:10:31.306582] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:30.845 [2024-05-15 00:10:31.306599] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:30.845 [2024-05-15 00:10:31.306645] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:28:30.845 00:28:30.845 [2024-05-15 00:10:31.306664] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:31.414 00:28:31.414 real 0m3.451s 00:28:31.414 user 0m2.867s 00:28:31.414 sys 0m0.548s 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:31.414 ************************************ 00:28:31.414 END TEST bdev_hello_world 00:28:31.414 ************************************ 00:28:31.414 00:10:31 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:31.414 00:10:31 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:31.414 00:10:31 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:31.414 00:10:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:31.414 ************************************ 00:28:31.414 START TEST bdev_bounds 00:28:31.414 ************************************ 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=546615 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 546615' 00:28:31.414 Process bdevio pid: 546615 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 546615 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 546615 ']' 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:31.414 
00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:31.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:31.414 00:10:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:31.414 [2024-05-15 00:10:31.911169] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:28:31.414 [2024-05-15 00:10:31.911228] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid546615 ] 00:28:31.674 [2024-05-15 00:10:32.041713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:31.674 [2024-05-15 00:10:32.146812] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:31.674 [2024-05-15 00:10:32.146899] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:28:31.674 [2024-05-15 00:10:32.146904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:31.674 [2024-05-15 00:10:32.168251] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:31.674 [2024-05-15 00:10:32.176283] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:31.674 [2024-05-15 00:10:32.184304] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:31.932 [2024-05-15 00:10:32.290325] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:34.467 [2024-05-15 00:10:34.734070] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:34.467 [2024-05-15 00:10:34.734153] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:34.467 [2024-05-15 00:10:34.734168] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.467 [2024-05-15 00:10:34.742085] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:34.467 [2024-05-15 00:10:34.742105] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:34.467 [2024-05-15 00:10:34.742117] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.467 [2024-05-15 00:10:34.750106] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:34.467 [2024-05-15 00:10:34.750124] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:34.467 [2024-05-15 00:10:34.750135] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.467 [2024-05-15 00:10:34.758126] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:34.467 [2024-05-15 00:10:34.758144] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:34.467 [2024-05-15 00:10:34.758159] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.467 00:10:34 blockdev_crypto_aesni.bdev_bounds -- 
common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:34.467 00:10:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:28:34.467 00:10:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:34.467 I/O targets: 00:28:34.467 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:28:34.467 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:28:34.467 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:28:34.467 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:28:34.467 00:28:34.467 00:28:34.467 CUnit - A unit testing framework for C - Version 2.1-3 00:28:34.467 http://cunit.sourceforge.net/ 00:28:34.467 00:28:34.467 00:28:34.467 Suite: bdevio tests on: crypto_ram4 00:28:34.467 Test: blockdev write read block ...passed 00:28:34.467 Test: blockdev write zeroes read block ...passed 00:28:34.467 Test: blockdev write zeroes read no split ...passed 00:28:34.467 Test: blockdev write zeroes read split ...passed 00:28:34.467 Test: blockdev write zeroes read split partial ...passed 00:28:34.467 Test: blockdev reset ...passed 00:28:34.467 Test: blockdev write read 8 blocks ...passed 00:28:34.467 Test: blockdev write read size > 128k ...passed 00:28:34.467 Test: blockdev write read invalid size ...passed 00:28:34.467 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:34.467 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:34.467 Test: blockdev write read max offset ...passed 00:28:34.467 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:34.467 Test: blockdev writev readv 8 blocks ...passed 00:28:34.467 Test: blockdev writev readv 30 x 1block ...passed 00:28:34.467 Test: blockdev writev readv block ...passed 00:28:34.467 Test: blockdev writev readv size > 128k ...passed 00:28:34.467 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:34.467 Test: blockdev comparev and writev ...passed 00:28:34.467 Test: blockdev nvme passthru rw ...passed 00:28:34.467 Test: blockdev nvme passthru vendor specific ...passed 00:28:34.467 Test: blockdev nvme admin passthru ...passed 00:28:34.467 Test: blockdev copy ...passed 00:28:34.467 Suite: bdevio tests on: crypto_ram3 00:28:34.467 Test: blockdev write read block ...passed 00:28:34.467 Test: blockdev write zeroes read block ...passed 00:28:34.467 Test: blockdev write zeroes read no split ...passed 00:28:34.467 Test: blockdev write zeroes read split ...passed 00:28:34.467 Test: blockdev write zeroes read split partial ...passed 00:28:34.467 Test: blockdev reset ...passed 00:28:34.467 Test: blockdev write read 8 blocks ...passed 00:28:34.467 Test: blockdev write read size > 128k ...passed 00:28:34.467 Test: blockdev write read invalid size ...passed 00:28:34.467 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:34.467 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:34.467 Test: blockdev write read max offset ...passed 00:28:34.467 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:34.467 Test: blockdev writev readv 8 blocks ...passed 00:28:34.467 Test: blockdev writev readv 30 x 1block ...passed 00:28:34.467 Test: blockdev writev readv block ...passed 00:28:34.467 Test: blockdev writev readv size > 128k ...passed 00:28:34.467 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:34.467 Test: blockdev comparev and writev ...passed 00:28:34.467 
Test: blockdev nvme passthru rw ...passed 00:28:34.467 Test: blockdev nvme passthru vendor specific ...passed 00:28:34.467 Test: blockdev nvme admin passthru ...passed 00:28:34.467 Test: blockdev copy ...passed 00:28:34.467 Suite: bdevio tests on: crypto_ram2 00:28:34.467 Test: blockdev write read block ...passed 00:28:34.467 Test: blockdev write zeroes read block ...passed 00:28:34.467 Test: blockdev write zeroes read no split ...passed 00:28:34.729 Test: blockdev write zeroes read split ...passed 00:28:34.729 Test: blockdev write zeroes read split partial ...passed 00:28:34.729 Test: blockdev reset ...passed 00:28:34.729 Test: blockdev write read 8 blocks ...passed 00:28:34.729 Test: blockdev write read size > 128k ...passed 00:28:34.729 Test: blockdev write read invalid size ...passed 00:28:34.729 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:34.729 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:34.729 Test: blockdev write read max offset ...passed 00:28:34.729 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:34.729 Test: blockdev writev readv 8 blocks ...passed 00:28:34.729 Test: blockdev writev readv 30 x 1block ...passed 00:28:34.729 Test: blockdev writev readv block ...passed 00:28:34.729 Test: blockdev writev readv size > 128k ...passed 00:28:34.729 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:34.729 Test: blockdev comparev and writev ...passed 00:28:34.729 Test: blockdev nvme passthru rw ...passed 00:28:34.729 Test: blockdev nvme passthru vendor specific ...passed 00:28:34.729 Test: blockdev nvme admin passthru ...passed 00:28:34.729 Test: blockdev copy ...passed 00:28:34.729 Suite: bdevio tests on: crypto_ram 00:28:34.729 Test: blockdev write read block ...passed 00:28:34.729 Test: blockdev write zeroes read block ...passed 00:28:34.729 Test: blockdev write zeroes read no split ...passed 00:28:34.729 Test: blockdev write zeroes read split ...passed 00:28:34.730 Test: blockdev write zeroes read split partial ...passed 00:28:34.730 Test: blockdev reset ...passed 00:28:34.730 Test: blockdev write read 8 blocks ...passed 00:28:34.730 Test: blockdev write read size > 128k ...passed 00:28:34.730 Test: blockdev write read invalid size ...passed 00:28:34.730 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:34.730 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:34.730 Test: blockdev write read max offset ...passed 00:28:34.730 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:34.730 Test: blockdev writev readv 8 blocks ...passed 00:28:34.730 Test: blockdev writev readv 30 x 1block ...passed 00:28:34.730 Test: blockdev writev readv block ...passed 00:28:34.730 Test: blockdev writev readv size > 128k ...passed 00:28:34.730 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:34.730 Test: blockdev comparev and writev ...passed 00:28:34.730 Test: blockdev nvme passthru rw ...passed 00:28:34.730 Test: blockdev nvme passthru vendor specific ...passed 00:28:34.730 Test: blockdev nvme admin passthru ...passed 00:28:34.730 Test: blockdev copy ...passed 00:28:34.730 00:28:34.730 Run Summary: Type Total Ran Passed Failed Inactive 00:28:34.730 suites 4 4 n/a 0 0 00:28:34.730 tests 92 92 92 0 0 00:28:34.730 asserts 520 520 520 0 n/a 00:28:34.730 00:28:34.730 Elapsed time = 0.556 seconds 00:28:34.730 0 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # 
killprocess 546615 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 546615 ']' 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 546615 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:34.730 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 546615 00:28:34.989 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:34.989 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:34.989 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 546615' 00:28:34.989 killing process with pid 546615 00:28:34.989 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@965 -- # kill 546615 00:28:34.989 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@970 -- # wait 546615 00:28:35.248 00:10:35 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:35.248 00:28:35.248 real 0m3.930s 00:28:35.248 user 0m10.809s 00:28:35.248 sys 0m0.726s 00:28:35.248 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:35.248 00:10:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:35.248 ************************************ 00:28:35.248 END TEST bdev_bounds 00:28:35.248 ************************************ 00:28:35.248 00:10:35 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:35.248 00:10:35 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:28:35.248 00:10:35 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:35.248 00:10:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:35.507 ************************************ 00:28:35.507 START TEST bdev_nbd 00:28:35.507 ************************************ 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e 
/sys/module/nbd ]] 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=547152 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 547152 /var/tmp/spdk-nbd.sock 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 547152 ']' 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:35.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:35.507 00:10:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:35.507 [2024-05-15 00:10:35.948665] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
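The bdev_svc instance starting up here is driven entirely over its RPC socket by the nbd helpers; nothing touches the crypto bdevs from the shell directly. Condensed from the commands that appear later in this trace (same workspace paths; root privileges and a loaded nbd kernel module are assumed, waitforlisten and cleanup omitted):

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  rpc="$spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  # start a bare bdev app that only loads the crypto bdev JSON config
  $spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json $spdk/test/bdev/bdev.json &
  # export one of the crypto vbdevs as a kernel block device, list it, tear it down
  $rpc nbd_start_disk crypto_ram /dev/nbd0
  $rpc nbd_get_disks
  $rpc nbd_stop_disk /dev/nbd0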
00:28:35.507 [2024-05-15 00:10:35.948735] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:35.507 [2024-05-15 00:10:36.079789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:35.766 [2024-05-15 00:10:36.189002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:35.766 [2024-05-15 00:10:36.210336] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:35.766 [2024-05-15 00:10:36.218351] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:35.766 [2024-05-15 00:10:36.226369] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:35.766 [2024-05-15 00:10:36.341355] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:38.300 [2024-05-15 00:10:38.780582] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:38.300 [2024-05-15 00:10:38.780661] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:38.300 [2024-05-15 00:10:38.780677] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.300 [2024-05-15 00:10:38.788602] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:38.300 [2024-05-15 00:10:38.788623] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:38.300 [2024-05-15 00:10:38.788637] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.300 [2024-05-15 00:10:38.796622] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:38.300 [2024-05-15 00:10:38.796641] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:38.300 [2024-05-15 00:10:38.796652] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.300 [2024-05-15 00:10:38.804642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:38.300 [2024-05-15 00:10:38.804660] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:38.300 [2024-05-15 00:10:38.804671] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 
'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:38.559 00:10:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:38.559 1+0 records in 00:28:38.559 1+0 records out 00:28:38.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286936 s, 14.3 MB/s 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:38.559 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:28:38.818 00:10:39 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:38.818 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:39.076 1+0 records in 00:28:39.076 1+0 records out 00:28:39.076 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325569 s, 12.6 MB/s 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:39.076 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 
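The waitfornbd helper being traced at this point just polls /proc/partitions until the kernel has registered the new device, then reads one 4 KiB block with O_DIRECT to prove the export actually answers I/O. A condensed sketch of that helper (the 20-try limit, the grep and the dd match the trace; the sleep interval and the temporary file path are assumptions):

  waitfornbd() {
      local nbd_name=$1 i
      for ((i = 1; i <= 20; i++)); do
          # the device name shows up in /proc/partitions once the NBD connection is live
          grep -q -w "$nbd_name" /proc/partitions && break
          sleep 0.1
      done
      # a single direct-I/O read confirms the block device is usable
      dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct
      rm -f /tmp/nbdtest
  }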
00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:39.336 1+0 records in 00:28:39.336 1+0 records out 00:28:39.336 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322723 s, 12.7 MB/s 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:39.336 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:39.595 1+0 records in 00:28:39.595 1+0 records out 00:28:39.595 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000373726 s, 11.0 MB/s 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:39.595 00:10:39 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:39.595 00:10:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:39.595 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd0", 00:28:39.595 "bdev_name": "crypto_ram" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd1", 00:28:39.595 "bdev_name": "crypto_ram2" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd2", 00:28:39.595 "bdev_name": "crypto_ram3" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd3", 00:28:39.595 "bdev_name": "crypto_ram4" 00:28:39.595 } 00:28:39.595 ]' 00:28:39.595 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:39.595 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd0", 00:28:39.595 "bdev_name": "crypto_ram" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd1", 00:28:39.595 "bdev_name": "crypto_ram2" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd2", 00:28:39.595 "bdev_name": "crypto_ram3" 00:28:39.595 }, 00:28:39.595 { 00:28:39.595 "nbd_device": "/dev/nbd3", 00:28:39.595 "bdev_name": "crypto_ram4" 00:28:39.595 } 00:28:39.595 ]' 00:28:39.595 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:39.853 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
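Each exported device is stopped the same way: nbd_stop_disk over the socket, then a poll of /proc/partitions until the name disappears. Once all four are gone, the harness asserts that nothing is still exported by counting nbd_get_disks entries, which in this trace boils down to roughly:

  json=$($rpc nbd_get_disks)
  # extract the /dev/nbd* paths and count them; grep exits non-zero on zero matches
  count=$(echo "$json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
  [ "$count" -eq 0 ] || { echo "stale NBD exports: $count"; exit 1; }

($rpc is the same rpc.py wrapper as in the sketch above; the exit-on-failure line is an assumption, the real helper returns a status to its caller.)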
00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:40.112 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:40.370 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:40.370 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:40.370 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:40.371 00:10:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:40.630 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
00:28:40.889 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:41.148 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:41.407 /dev/nbd0 00:28:41.407 00:10:41 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:41.407 1+0 records in 00:28:41.407 1+0 records out 00:28:41.407 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000210075 s, 19.5 MB/s 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:41.407 00:10:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:28:41.699 /dev/nbd1 00:28:41.699 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:41.699 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:41.699 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:41.700 1+0 records in 00:28:41.700 1+0 records out 00:28:41.700 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00035953 s, 11.4 MB/s 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:41.700 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:28:41.963 /dev/nbd10 00:28:41.963 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:28:41.963 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:41.964 1+0 records in 00:28:41.964 1+0 records out 00:28:41.964 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000173595 s, 23.6 MB/s 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:28:41.964 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:28:41.964 /dev/nbd11 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:42.223 1+0 records in 00:28:42.223 1+0 records out 00:28:42.223 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270343 s, 15.2 MB/s 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:42.223 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:42.482 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:28:42.482 { 00:28:42.482 "nbd_device": "/dev/nbd0", 00:28:42.482 "bdev_name": "crypto_ram" 00:28:42.482 }, 00:28:42.482 { 00:28:42.482 "nbd_device": "/dev/nbd1", 00:28:42.482 "bdev_name": "crypto_ram2" 00:28:42.483 }, 00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd10", 00:28:42.483 "bdev_name": "crypto_ram3" 00:28:42.483 }, 00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd11", 00:28:42.483 "bdev_name": "crypto_ram4" 00:28:42.483 } 00:28:42.483 ]' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd0", 00:28:42.483 "bdev_name": "crypto_ram" 00:28:42.483 }, 00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd1", 00:28:42.483 "bdev_name": "crypto_ram2" 00:28:42.483 }, 00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd10", 00:28:42.483 "bdev_name": "crypto_ram3" 00:28:42.483 }, 00:28:42.483 { 00:28:42.483 "nbd_device": "/dev/nbd11", 00:28:42.483 "bdev_name": "crypto_ram4" 00:28:42.483 } 00:28:42.483 ]' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:28:42.483 /dev/nbd1 00:28:42.483 /dev/nbd10 00:28:42.483 /dev/nbd11' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:28:42.483 /dev/nbd1 00:28:42.483 /dev/nbd10 00:28:42.483 /dev/nbd11' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:28:42.483 256+0 records in 00:28:42.483 256+0 records out 00:28:42.483 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115254 s, 91.0 MB/s 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:28:42.483 256+0 records in 00:28:42.483 256+0 records out 00:28:42.483 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.052379 s, 20.0 MB/s 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:42.483 00:10:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:28:42.483 256+0 records in 00:28:42.483 256+0 records out 00:28:42.483 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0515793 s, 20.3 MB/s 00:28:42.483 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:42.483 00:10:43 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:28:42.742 256+0 records in 00:28:42.742 256+0 records out 00:28:42.742 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0602818 s, 17.4 MB/s 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:28:42.742 256+0 records in 00:28:42.742 256+0 records out 00:28:42.742 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0399312 s, 26.3 MB/s 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:42.742 
00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:42.742 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:43.002 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:43.261 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:43.520 00:10:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:43.778 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:28:43.779 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:28:44.037 malloc_lvol_verify 00:28:44.037 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:28:44.295 4900f2b9-f5c5-44c8-88c7-f3ed65c0a23c 00:28:44.295 00:10:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:28:44.554 a4fa8b2d-cfdb-4b90-aeee-cde5ae7395a8 00:28:44.554 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:44.812 /dev/nbd0 00:28:44.812 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:44.812 mke2fs 1.46.5 (30-Dec-2021) 00:28:44.812 Discarding device blocks: 0/4096 done 00:28:44.813 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:44.813 00:28:44.813 Allocating group tables: 0/1 done 00:28:44.813 Writing inode tables: 0/1 done 00:28:44.813 Creating journal (1024 blocks): done 00:28:44.813 Writing superblocks and filesystem accounting information: 0/1 done 00:28:44.813 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:44.813 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 547152 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 547152 ']' 00:28:45.071 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 547152 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 547152 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 547152' 00:28:45.330 killing process with pid 547152 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@965 -- # kill 547152 00:28:45.330 00:10:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@970 -- # wait 547152 00:28:45.897 00:10:46 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:45.898 00:28:45.898 real 0m10.304s 00:28:45.898 user 0m13.152s 00:28:45.898 sys 0m4.141s 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:45.898 ************************************ 00:28:45.898 END TEST bdev_nbd 00:28:45.898 ************************************ 00:28:45.898 00:10:46 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:45.898 00:10:46 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:45.898 00:10:46 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:45.898 00:10:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:45.898 ************************************ 00:28:45.898 START TEST bdev_fio 00:28:45.898 ************************************ 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:45.898 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:28:45.898 00:10:46 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:45.898 ************************************ 00:28:45.898 START TEST bdev_fio_rw_verify 00:28:45.898 ************************************ 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:28:45.898 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:46.156 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:46.156 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:46.156 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:46.156 00:10:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:46.414 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:46.414 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:46.414 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:46.414 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:46.414 fio-3.35 00:28:46.414 Starting 4 threads 00:29:01.293 00:29:01.293 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=549199: Wed May 15 00:10:59 2024 00:29:01.293 read: IOPS=20.1k, BW=78.5MiB/s (82.3MB/s)(785MiB/10001msec) 00:29:01.293 slat (usec): min=16, max=492, avg=65.44, stdev=35.64 00:29:01.293 clat (usec): min=12, max=2303, avg=358.70, stdev=219.29 00:29:01.293 lat (usec): min=50, max=2591, avg=424.14, stdev=238.67 00:29:01.293 clat percentiles (usec): 00:29:01.293 | 50.000th=[ 306], 99.000th=[ 1037], 99.900th=[ 1237], 99.990th=[ 1450], 00:29:01.293 | 99.999th=[ 1991] 00:29:01.293 write: IOPS=22.2k, BW=86.6MiB/s (90.8MB/s)(844MiB/9753msec); 0 zone resets 00:29:01.293 slat (usec): min=24, max=1317, avg=79.50, stdev=35.33 00:29:01.293 clat (usec): min=25, max=2227, avg=441.33, stdev=265.18 00:29:01.293 lat (usec): min=54, max=2414, avg=520.83, stdev=283.55 00:29:01.294 clat percentiles (usec): 00:29:01.294 | 50.000th=[ 392], 99.000th=[ 1303], 99.900th=[ 1532], 99.990th=[ 1663], 00:29:01.294 | 99.999th=[ 1958] 00:29:01.294 bw ( KiB/s): min=69600, max=128504, per=97.91%, avg=86802.11, stdev=3239.85, samples=76 00:29:01.294 iops : min=17400, max=32126, avg=21700.53, stdev=809.96, samples=76 00:29:01.294 lat (usec) : 20=0.01%, 50=0.01%, 100=4.12%, 250=27.23%, 500=41.18% 00:29:01.294 lat (usec) : 750=17.74%, 1000=6.74% 00:29:01.294 lat (msec) : 2=2.99%, 4=0.01% 00:29:01.294 cpu : usr=99.61%, sys=0.00%, ctx=67, majf=0, minf=310 00:29:01.294 IO depths : 1=10.1%, 2=25.6%, 4=51.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:01.294 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:01.294 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:01.294 issued rwts: total=200982,216156,0,0 short=0,0,0,0 dropped=0,0,0,0 
00:29:01.294 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:01.294 00:29:01.294 Run status group 0 (all jobs): 00:29:01.294 READ: bw=78.5MiB/s (82.3MB/s), 78.5MiB/s-78.5MiB/s (82.3MB/s-82.3MB/s), io=785MiB (823MB), run=10001-10001msec 00:29:01.294 WRITE: bw=86.6MiB/s (90.8MB/s), 86.6MiB/s-86.6MiB/s (90.8MB/s-90.8MB/s), io=844MiB (885MB), run=9753-9753msec 00:29:01.294 00:29:01.294 real 0m13.791s 00:29:01.294 user 0m45.746s 00:29:01.294 sys 0m0.681s 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:01.294 ************************************ 00:29:01.294 END TEST bdev_fio_rw_verify 00:29:01.294 ************************************ 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c46d3c0d-d908-580f-a497-826693a4d1a6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c46d3c0d-d908-580f-a497-826693a4d1a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": 
true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "b49b5245-df65-5a9d-ac52-5a1ec0d45396"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b49b5245-df65-5a9d-ac52-5a1ec0d45396",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b333c8c8-139f-52ba-a094-b58bbc21f953"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b333c8c8-139f-52ba-a094-b58bbc21f953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fe94e446-2760-5bef-84ca-a16ddc887705"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fe94e446-2760-5bef-84ca-a16ddc887705",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:01.294 crypto_ram2 00:29:01.294 crypto_ram3 00:29:01.294 crypto_ram4 ]] 00:29:01.294 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c46d3c0d-d908-580f-a497-826693a4d1a6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c46d3c0d-d908-580f-a497-826693a4d1a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "b49b5245-df65-5a9d-ac52-5a1ec0d45396"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b49b5245-df65-5a9d-ac52-5a1ec0d45396",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b333c8c8-139f-52ba-a094-b58bbc21f953"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b333c8c8-139f-52ba-a094-b58bbc21f953",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fe94e446-2760-5bef-84ca-a16ddc887705"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fe94e446-2760-5bef-84ca-a16ddc887705",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:01.295 ************************************ 00:29:01.295 START TEST bdev_fio_trim 00:29:01.295 ************************************ 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:01.295 00:11:00 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:01.295 00:11:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:01.295 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:01.295 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:01.295 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:01.295 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:01.295 fio-3.35 00:29:01.295 Starting 4 threads 00:29:13.507 00:29:13.507 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=551060: Wed May 15 00:11:13 2024 00:29:13.507 write: IOPS=34.9k, BW=136MiB/s (143MB/s)(1363MiB/10001msec); 0 zone resets 00:29:13.507 slat (usec): min=10, max=1287, avg=64.80, stdev=35.76 00:29:13.507 clat (usec): min=37, max=1697, avg=291.77, stdev=183.60 00:29:13.507 lat (usec): min=55, max=1939, avg=356.57, stdev=205.24 00:29:13.507 clat percentiles (usec): 00:29:13.507 | 50.000th=[ 245], 99.000th=[ 881], 99.900th=[ 1020], 99.990th=[ 1156], 00:29:13.507 | 99.999th=[ 1647] 00:29:13.507 bw ( KiB/s): min=121440, max=199944, per=100.00%, avg=139924.21, stdev=4728.97, samples=76 00:29:13.507 iops : min=30360, max=49986, avg=34981.05, stdev=1182.24, samples=76 00:29:13.507 trim: IOPS=34.9k, BW=136MiB/s (143MB/s)(1363MiB/10001msec); 0 zone resets 00:29:13.507 slat (usec): min=4, max=146, avg=18.64, stdev= 7.72 00:29:13.507 clat (usec): min=33, max=1519, avg=274.88, stdev=124.63 00:29:13.507 lat (usec): min=39, max=1534, avg=293.53, stdev=127.00 00:29:13.507 clat percentiles (usec): 00:29:13.507 | 50.000th=[ 255], 99.000th=[ 619], 99.900th=[ 742], 99.990th=[ 832], 00:29:13.507 | 99.999th=[ 1123] 00:29:13.507 bw ( KiB/s): min=121448, max=199976, per=100.00%, avg=139925.05, stdev=4730.03, samples=76 00:29:13.507 iops : min=30362, max=49994, avg=34981.37, stdev=1182.50, samples=76 00:29:13.507 lat (usec) : 50=0.36%, 100=6.43%, 250=43.09%, 500=41.05%, 750=7.46% 00:29:13.507 lat (usec) : 1000=1.54% 00:29:13.507 lat (msec) : 2=0.07% 00:29:13.507 cpu : usr=99.64%, sys=0.00%, ctx=57, majf=0, minf=105 00:29:13.507 IO depths : 1=7.9%, 2=26.3%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:13.507 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:13.507 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:13.507 issued rwts: total=0,348997,348998,0 short=0,0,0,0 dropped=0,0,0,0 00:29:13.507 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:13.507 00:29:13.507 Run status group 0 (all jobs): 00:29:13.507 WRITE: bw=136MiB/s (143MB/s), 136MiB/s-136MiB/s (143MB/s-143MB/s), io=1363MiB (1429MB), run=10001-10001msec 00:29:13.507 TRIM: bw=136MiB/s (143MB/s), 
136MiB/s-136MiB/s (143MB/s-143MB/s), io=1363MiB (1429MB), run=10001-10001msec 00:29:13.766 00:29:13.766 real 0m13.755s 00:29:13.766 user 0m45.611s 00:29:13.766 sys 0m0.648s 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:13.766 ************************************ 00:29:13.766 END TEST bdev_fio_trim 00:29:13.766 ************************************ 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:13.766 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:13.766 00:29:13.766 real 0m27.929s 00:29:13.766 user 1m31.554s 00:29:13.766 sys 0m1.529s 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:13.766 ************************************ 00:29:13.766 END TEST bdev_fio 00:29:13.766 ************************************ 00:29:13.766 00:11:14 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:13.766 00:11:14 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:13.766 00:11:14 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:29:13.766 00:11:14 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:13.766 00:11:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:13.766 ************************************ 00:29:13.766 START TEST bdev_verify 00:29:13.766 ************************************ 00:29:13.766 00:11:14 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:13.766 [2024-05-15 00:11:14.348294] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
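A note on the bdev_fio stage that completed above, before the bdevperf-based verify stages begin: fio is not pointed at kernel block devices here; it drives the SPDK bdevs directly through the spdk_bdev ioengine plugin, and the job file is generated on the fly with one [job_*] section per crypto bdev. The shell sketch below reconstructs the rw-verify run from the xtrace lines above; only serialize_overlap=1 and the per-bdev job sections are visible in the trace, so the rest of the [global] section (workload and verify options) is an assumption, not the exact file fio_config_gen writes.

# Sketch only; paths taken from the log, [global] contents partly assumed.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
cat > "$SPDK/test/bdev/bdev.fio" <<'EOF'
[global]
serialize_overlap=1

[job_crypto_ram]
filename=crypto_ram

[job_crypto_ram2]
filename=crypto_ram2

[job_crypto_ram3]
filename=crypto_ram3

[job_crypto_ram4]
filename=crypto_ram4
EOF
# The plugin is preloaded so fio's spdk_bdev ioengine resolves the filenames
# against the bdevs described in bdev.json rather than any OS-visible device.
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$SPDK/test/bdev/bdev.fio" --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --spdk_mem=0 --aux-path="$SPDK/../output"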
00:29:13.766 [2024-05-15 00:11:14.348353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid552480 ] 00:29:14.025 [2024-05-15 00:11:14.475390] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:14.025 [2024-05-15 00:11:14.574192] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:14.025 [2024-05-15 00:11:14.574197] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:14.025 [2024-05-15 00:11:14.595572] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:14.025 [2024-05-15 00:11:14.603595] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:14.025 [2024-05-15 00:11:14.611635] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:14.284 [2024-05-15 00:11:14.712754] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:16.847 [2024-05-15 00:11:17.144611] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:16.847 [2024-05-15 00:11:17.144695] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:16.847 [2024-05-15 00:11:17.144711] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:16.847 [2024-05-15 00:11:17.152630] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:16.847 [2024-05-15 00:11:17.152649] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:16.847 [2024-05-15 00:11:17.152661] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:16.847 [2024-05-15 00:11:17.160649] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:16.847 [2024-05-15 00:11:17.160666] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:16.847 [2024-05-15 00:11:17.160678] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:16.847 [2024-05-15 00:11:17.168672] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:16.847 [2024-05-15 00:11:17.168689] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:16.847 [2024-05-15 00:11:17.168701] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:16.847 Running I/O for 5 seconds... 
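The startup messages above show the verify stage wiring: four AES-NI crypto vbdevs (crypto_ram through crypto_ram4) are registered against keys test_dek_aesni_cbc_1 through _4, their creation deferred until the base bdevs Malloc0 through Malloc3 arrive, and then bdevperf starts a 5-second pass. The invocation is the one traced at the start of this test; a standalone sketch follows, with flag meanings corroborated by the job headers in the result table below (-C is passed through without annotation because the log does not spell out its effect).

# Sketch of the 4 KiB verify pass as launched by the suite:
#   -q 128     queue depth per job          -o 4096   I/O size in bytes
#   -w verify  verified read/write pass     -t 5      run time in seconds
#   -m 0x3     core mask, reactors on cores 0 and 1 ("Total cores available: 2" above)
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3
# The bdev_verify_big_io stage further down repeats this with -o 65536.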
00:29:22.118
00:29:22.118 Latency(us)
00:29:22.118 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:22.118 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x0 length 0x1000
00:29:22.118 crypto_ram : 5.08 503.48 1.97 0.00 0.00 253589.94 4046.14 175066.60
00:29:22.118 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x1000 length 0x1000
00:29:22.118 crypto_ram : 5.08 504.14 1.97 0.00 0.00 253303.66 4957.94 174154.80
00:29:22.118 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x0 length 0x1000
00:29:22.118 crypto_ram2 : 5.08 503.58 1.97 0.00 0.00 252704.44 4074.63 162301.33
00:29:22.118 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x1000 length 0x1000
00:29:22.118 crypto_ram2 : 5.08 504.04 1.97 0.00 0.00 252470.77 5043.42 161389.52
00:29:22.118 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x0 length 0x1000
00:29:22.118 crypto_ram3 : 5.07 3888.38 15.19 0.00 0.00 32604.70 4616.01 27582.11
00:29:22.118 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x1000 length 0x1000
00:29:22.118 crypto_ram3 : 5.07 3914.40 15.29 0.00 0.00 32387.34 4786.98 27582.11
00:29:22.118 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x0 length 0x1000
00:29:22.118 crypto_ram4 : 5.07 3888.16 15.19 0.00 0.00 32499.03 5043.42 26100.42
00:29:22.118 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:29:22.118 Verification LBA range: start 0x1000 length 0x1000
00:29:22.118 crypto_ram4 : 5.07 3914.93 15.29 0.00 0.00 32281.65 5071.92 25758.50
00:29:22.118 ===================================================================================================================
00:29:22.118 Total : 17621.11 68.83 0.00 0.00 57725.32 4046.14 175066.60
00:29:22.377
00:29:22.377 real 0m8.542s
00:29:22.377 user 0m16.054s
00:29:22.377 sys 0m0.525s
00:11:22 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable
00:29:22.377 00:11:22 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:29:22.377 ************************************
00:29:22.377 END TEST bdev_verify
00:29:22.377 ************************************
00:29:22.377 00:11:22 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:11:22 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']'
00:11:22 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable
00:11:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:22.377 ************************************
00:29:22.377 START TEST bdev_verify_big_io
00:29:22.377 ************************************
00:29:22.377 00:11:22 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:22.635 [2024-05-15 00:11:22.970820] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:29:22.635 [2024-05-15 00:11:22.970878] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid553546 ] 00:29:22.635 [2024-05-15 00:11:23.096699] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:22.635 [2024-05-15 00:11:23.201589] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:22.635 [2024-05-15 00:11:23.201595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.635 [2024-05-15 00:11:23.223042] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:22.893 [2024-05-15 00:11:23.231063] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:22.893 [2024-05-15 00:11:23.239082] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:22.893 [2024-05-15 00:11:23.344652] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:25.423 [2024-05-15 00:11:25.766533] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:25.423 [2024-05-15 00:11:25.766605] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:25.423 [2024-05-15 00:11:25.766621] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:25.423 [2024-05-15 00:11:25.774549] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:25.423 [2024-05-15 00:11:25.774575] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:25.423 [2024-05-15 00:11:25.774587] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:25.423 [2024-05-15 00:11:25.782571] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:25.423 [2024-05-15 00:11:25.782591] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:25.423 [2024-05-15 00:11:25.782603] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:25.423 [2024-05-15 00:11:25.790592] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:25.423 [2024-05-15 00:11:25.790610] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:25.423 [2024-05-15 00:11:25.790621] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:25.423 Running I/O for 5 seconds... 00:29:28.714 [2024-05-15 00:11:28.591205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.592962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.594722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.714 [2024-05-15 00:11:28.595549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.598195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.598609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.598994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.600346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.602444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.604126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.604680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.606339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.607819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.608216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.608956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.610355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.612386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.613619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.614941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.616338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.617821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.618216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.620028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.621801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.623940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.624753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.626139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.714 [2024-05-15 00:11:28.627811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.629425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.631039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.632524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.634198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.634968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.636450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.638136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.639820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.642051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.643464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.645145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.646826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.649020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.650736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.652545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.654157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.656811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.658503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.660182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.661098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.714 [2024-05-15 00:11:28.662829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.664503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.666168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.715 [2024-05-15 00:11:28.666566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.669697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.671416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.672935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.673948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.676034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.677714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.678656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.679059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.682021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.683694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.683743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.684319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.686369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.688068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.688116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.689207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.690592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.691990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.692039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.693739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.694144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.694831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.694879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.715 [2024-05-15 00:11:28.696282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.697418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.697822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.697865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.698254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.698667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.700362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.700434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.702097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.703365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.705059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.705108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.706789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.707348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.707757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.707804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.708458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.709594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.710414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.710462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.711843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.712252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.713272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.713330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.715 [2024-05-15 00:11:28.713722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.715834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.715892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.717467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.717517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.718265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.718319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.718716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.718759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.720371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.720436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.721971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.722019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.722808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.722864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.723249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.723306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.725822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.725881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.727484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.727542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.728409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.728466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.729087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.715 [2024-05-15 00:11:28.729134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.731835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.731893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.733081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.733127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.734069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.734125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.735862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.735908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.738667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.738731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.739126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.715 [2024-05-15 00:11:28.739173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.740426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.740485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.741758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.741807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.744482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.744542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.744927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.744973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.746460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.746517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.748052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.716 [2024-05-15 00:11:28.748100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.749994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.750052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.750445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.750488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.751265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.751323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.751736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.751784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.753564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.753638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.754024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.754067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.754936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.754996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.755387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.755446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.757493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.757552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.757935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.757991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.758954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.759011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.759408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.716 [2024-05-15 00:11:28.759457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.761375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.761442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.761829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.761873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.762809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.762866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.763254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.763303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.765117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.765176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.765570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.765615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.766507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.766566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.766952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.767002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.768619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.768697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.769090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.769136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.769968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.770026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.770425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.716 [2024-05-15 00:11:28.770475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.772120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.772196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.772601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.772651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.773468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.773532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.773924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.773990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.775796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.775861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.776260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.776313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.777149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.777225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.777632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.777691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.779617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.779682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.780070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.716 [2024-05-15 00:11:28.780118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.780999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.781075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.781485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.717 [2024-05-15 00:11:28.781535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.783942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.784000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.784392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.784459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.785395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.785466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.785867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.785922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.788307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.788365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.788777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.788842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.789784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.789847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.790233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.790291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.792331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.792394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.792791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.792839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.792860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.793337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.793845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.717 [2024-05-15 00:11:28.793907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.794294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.794359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.794378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.794739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.796443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.796842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.796899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.797286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.797710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.797868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.798261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.798310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.799845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.800105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.801873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.717 [2024-05-15 00:11:28.802230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.803924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.804182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.805979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.806251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.717 [2024-05-15 00:11:28.807186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.718 [2024-05-15 00:11:28.807346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.807895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.808206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.926392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.927812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.927861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.932773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.932829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.934508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.934553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.937279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.937342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.938982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.939028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.939915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.939971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.941715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.941768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.943585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.943642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.718 [2024-05-15 00:11:28.945012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.945057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.947094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.947150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.948383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.948436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.951242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.951299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.952738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.952783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.953789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.953844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.955241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.955286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.956785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.956839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.957234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.958966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.960756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.960810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.962476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.963223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.966000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.966062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.718 [2024-05-15 00:11:28.966452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.967799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.967854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.969242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.718 [2024-05-15 00:11:28.970364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.971821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.973230] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.974666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.975073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.975585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.975984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.976971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.979620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.980881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.982286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.983704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.984975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.985380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.718 [2024-05-15 00:11:28.985998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.987409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.989226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.990656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.992084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.993771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:28.719 [2024-05-15 00:11:28.994557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.994962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.996625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:28.998360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.000968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.002460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.004139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.005528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.006452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.008195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.009857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.011434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.014424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.016201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.016256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.017617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.018543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.020256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.020317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.022121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.023363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.024760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.024814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.026223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:28.719 [2024-05-15 00:11:29.026643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.027663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.027718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.028112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.029309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.031119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.031174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.032554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.032963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.034380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.034441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.035901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.037301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.038157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.038211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.039613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.040062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.041746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.041802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.042453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.043636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.045377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.045444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.045839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:28.719 [2024-05-15 00:11:29.046355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.719 [2024-05-15 00:11:29.047976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.048038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.049781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.051006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.052425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.052480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.053911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.054318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.055408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.055478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.055870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.057104] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.058737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.058806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.059816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.060267] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.061576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.061633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.062593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.064406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.065688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.065744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.720 [2024-05-15 00:11:29.066168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:28.721 [2024-05-15 00:11:29.130565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.721 [2024-05-15 00:11:29.130658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.721 [2024-05-15 00:11:29.131060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.721 [2024-05-15 00:11:29.131483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.721 [2024-05-15 00:11:29.133364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:28.721 [2024-05-15 00:11:29.133825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.134214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.134268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.134795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.134823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.135212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.135261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.135658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.136050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.137395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.137797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.137845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.138242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.139052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.139108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.139518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.139907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.140227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.721 [2024-05-15 00:11:29.141169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.985 [2024-05-15 00:11:29.393211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.394869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.395200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.399050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.400410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.401059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.402456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.404537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.406058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.985 [2024-05-15 00:11:29.407274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.408669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.408987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.414969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.415373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.416985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.418745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.420824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.421740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.423115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.424511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.424784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.426217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.427917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.428315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.986 [2024-05-15 00:11:29.430068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.431881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.433568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.434261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.435659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.435932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.437978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.438643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.438690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.440495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.442691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.442753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.444503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.444555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.444826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.448986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.450688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.450737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.451821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.452588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.452644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.454032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.454077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.454408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.986 [2024-05-15 00:11:29.458460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.459653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.459701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.461096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.463182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.463237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.464145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.464191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.464469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.466634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.468293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.468345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.469548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.471369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.471431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.473125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.473179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.473592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.478255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.478756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.478806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.480110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.481234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.481292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.986 [2024-05-15 00:11:29.482245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.482290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.482572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.485360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.486681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.486729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.487742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.489745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.489813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.490298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.490344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.490624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.493282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.494606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.494673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.496275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.497154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.497210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.498137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.498193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.498477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.502258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.503551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.503597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.986 [2024-05-15 00:11:29.505427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.507571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.507627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.508016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.508065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.508336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.512478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.513485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.513533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.514265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.515226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.516310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.516363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.516999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.517344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.520019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.520434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.522225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.522273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.522766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.523332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.523378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.524627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.524904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.986 [2024-05-15 00:11:29.527824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.528976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.529035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.529904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.531116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.986 [2024-05-15 00:11:29.531174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.532095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.533431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.533856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.536608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.536666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.537494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.537892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.538359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.539304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.540315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.540363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.540647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.542989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.544110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.545237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.545289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.546191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.546602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.987 [2024-05-15 00:11:29.546666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.547775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.548122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.551080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.552481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.552529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.553584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.554413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.554477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.554870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.555663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.555946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.558490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.558548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.559978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.560766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.561180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.561587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.561984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.562038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.562459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.566332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.566745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.567255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:28.987 [2024-05-15 00:11:29.567302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.568467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.569578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.569627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.570018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.570405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.571978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.573236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:28.987 [2024-05-15 00:11:29.573285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.573678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.574628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.574685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.575903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.576916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.577232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.579418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.579475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.580712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.581104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.581625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.582102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.583574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.583621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.583956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.248 [2024-05-15 00:11:29.586604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.588104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.588819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.588866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.589632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.590033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.590090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.590749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.591028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.593479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.595167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.595215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.595964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.596730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.596793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.597192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.597706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.597983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.600177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.601952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.602346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.603957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.604508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.604564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.248 [2024-05-15 00:11:29.604960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.605009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.605456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.609968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.610374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.610777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.610828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.611706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.613097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.613145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.613539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.613943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.615937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.616000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.616405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.616460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.618659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.618727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.619442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.619488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.619766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.622571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.622628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.623861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.248 [2024-05-15 00:11:29.623905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.625186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.625245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.626210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.626258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.626603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.631696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.631760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.632249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.632296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.633049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.633109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.634658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.634723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.634997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.639033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.639092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.640659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.640706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.642609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.642671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.248 [2024-05-15 00:11:29.644061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.644109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.644481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.249 [2024-05-15 00:11:29.648856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.648914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.649689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.649734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.651448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.651505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.652537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.652582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.652909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.657295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.657347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.658364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.658414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.658879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.658926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.660142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.660190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.660572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.663778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.663836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.663879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.663923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.665787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.665844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.249 [2024-05-15 00:11:29.665899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.665943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.666247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.669881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.670153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.674365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.674424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.674465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.674505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.674985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.675031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.675082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.675123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.675393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.677871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.677923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.677967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.249 [2024-05-15 00:11:29.678009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.678419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.678476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.678518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.678572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.678853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.682435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.682487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.682877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.682925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.683335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.683745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.683802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.685407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.685680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.689815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.689871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.691125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.691171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.691740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.693423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.693477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.695141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.695435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.249 [2024-05-15 00:11:29.701344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.701412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.702872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.702916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.703324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.703735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.703784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.705216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.705578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.710396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.710458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.711845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.711891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.712360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.714046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.714094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.714947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.249 [2024-05-15 00:11:29.715224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.718863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.718927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.720608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.720654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.721095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.250 [2024-05-15 00:11:29.722871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.785 [2024-05-15 00:11:30.177391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.179513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.179581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.181320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.181376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.181665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.186938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.186998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.188282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.188331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.189267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.189325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.190739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.190788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.191094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.196194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.196253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.197693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.197743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.198579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.198636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.199884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.199930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.200248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.785 [2024-05-15 00:11:30.205536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.205601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.207324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.207380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.208170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.208247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.785 [2024-05-15 00:11:30.208644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.208693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.208966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.213152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.213205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.214625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.214674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.215161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.215209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.216688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.216738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.217108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.221184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.221240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.221282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.221322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.223349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.223414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.223457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.223497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.223805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.226677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.226728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.226774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.226815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.227227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.227273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.227314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.227353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.227685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.228644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.228700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.228741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.228781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.229214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.229260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.229301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.229342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.229659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.230653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.230704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.230745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.230786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.231244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.231294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.231336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.231380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.231658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.232966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.233021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.234505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.234556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.234968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.236384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.236440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.238235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.238519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.240686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.240743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.241491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.241539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.241947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.242355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.242408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.244105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.244380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.246233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.246290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.247686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.247732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.248212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.249646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.249695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.250465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.250834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.253250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.253307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.254736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.254784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.255196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.256979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.257026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.258610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.258887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.260246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.260301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.261883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.261935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.262351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.263912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.263967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.265642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.266000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.268359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.268425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.269584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.269628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.270169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.270874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.270924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.272319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.272633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.274988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.275046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.276632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.276686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.277110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.278834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.278890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.279276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.279722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.282080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.282138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.283546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.283591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.284075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.285485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.285533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.286951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.287298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.288927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.288990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.290421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.290467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.290881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.290928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.292314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.292360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.292643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.294997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.295054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.295103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.296774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.297674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.297733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.299164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.299210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.299559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.786 [2024-05-15 00:11:30.300626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.300677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.301573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.301621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.302095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.786 [2024-05-15 00:11:30.303517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.303564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.303605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.303908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.305018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.305944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.305993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.306033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.307880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.307935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.307976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.309390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.309725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.312430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.312489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.312534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.314017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.314608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.314659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.787 [2024-05-15 00:11:30.315048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.315094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.315366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.316277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.316328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.318014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.318058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.318476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.319924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.319972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.320020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.320292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.323091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.324496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.324544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.324584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.326411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.326466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.326507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.327367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.327652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.332178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.332233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.332275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.787 [2024-05-15 00:11:30.333879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.334359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.334414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.336023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.336078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.336352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.337294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.337345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.338755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.338800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.339242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.340207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.340254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.340295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.340609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.341528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.342948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.342995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.343035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.344642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.344698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.344742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.346186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.346467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.787 [2024-05-15 00:11:30.348512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.348567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.348612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.349059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.349480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.349555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.349945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.349991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.350264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.351191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.351249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.353042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.353090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.353510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.354920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.354967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.355016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.355288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.356291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.356341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.356389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.356438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.358485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.359565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:29.787 [2024-05-15 00:11:30.359614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.360279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.360559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.361932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.361988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.362030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.362877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.363344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.363392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.364687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.364737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.365009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.366002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.367832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.367883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.368441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.368855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.369526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.369576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.370620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:29.787 [2024-05-15 00:11:30.370894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.371861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.373364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.373435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.052 [2024-05-15 00:11:30.373820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.374439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.376132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.376186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.377565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.377956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.378956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.379550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.379602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.379988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.380462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.381535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.381585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.383005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.383278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.384233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.385645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.385694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.386716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.387262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.387669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.387723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.389275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.389590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.052 [2024-05-15 00:11:30.390548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.392197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.392254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.392648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.393189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.394371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.394425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.394941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.395261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.396625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.398087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.398137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.398909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.400880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.401681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.052 [2024-05-15 00:11:30.401732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.402666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.403072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.404251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.405760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.407170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.407685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.408100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.409719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.053 [2024-05-15 00:11:30.411411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.411801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.412231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.416771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.418444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.418840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.419225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.420849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.421450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.422766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.423167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.423543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.425947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.426349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.426752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.427141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.428115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.428525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.428918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.429307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.429624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.431149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.431557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.431953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.053 [2024-05-15 00:11:30.432348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.433309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.433720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.434115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.434518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.435009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.436391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.436798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.436847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.437237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.438059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.438115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.438509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.438554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.438929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.439997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.440393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.440446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.440835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.441704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.441761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.442146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.442189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.442545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.053 [2024-05-15 00:11:30.443662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.444059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.444105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.444503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.445334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.445391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.445783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.445827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.446151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.447277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.447684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.447737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.448128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.448951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.449007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.449393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.449447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.449768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.450916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.451324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.451369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.451764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.452612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.452670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.053 [2024-05-15 00:11:30.453055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.453098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.453384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.454493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.454888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.454932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.455336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.456193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.456250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.456643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.456686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.053 [2024-05-15 00:11:30.456968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.458068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.458475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.458520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.458909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.459756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.459815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.460205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.460247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.460582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.461613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.463242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.463299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.054 [2024-05-15 00:11:30.463700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.465117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.465175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.466193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.466241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.466533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.467514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.468459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.468515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.468900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.471027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.472566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.472615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.473352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.473651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.474668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.475068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.476212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.476258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.476790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.477800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.477851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.479593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.479909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.054 [2024-05-15 00:11:30.483655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.484173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.484222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.485467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.486225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.486281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.486674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.487516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.487842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.490040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.490098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.491387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.491781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.492396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.493918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.495221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.495268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.495648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.496605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.497027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.497732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.497786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.499611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.501280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.054 [2024-05-15 00:11:30.501336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.502348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.502663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.506598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.507564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.507615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.508744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.509568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.509632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.510016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.511515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.511824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.514331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.514394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.516178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.517875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.518428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.518825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.519216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.519264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.519644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.521394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.522912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.524508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.054 [2024-05-15 00:11:30.524558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.526460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.527432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.527483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.527867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.528253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.530667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.532431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.532487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.533612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.535584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.054 [2024-05-15 00:11:30.535641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.537307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.538128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.538533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.541317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.541380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.543121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.544712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.545135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.546522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.547940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.547986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.548261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.055 [2024-05-15 00:11:30.549564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.550180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.551593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.551640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.553681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.555012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.555067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.556760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.557063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.558703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.559102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.559147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.560018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.561904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.561960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.563640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.564641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.564920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.567563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.568138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.568535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.569340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.569782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.569832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.055 [2024-05-15 00:11:30.571261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.571308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.571587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.572474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.574158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.575821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.575873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.576635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.577029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.577077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.578557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.578876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.580750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.580820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.582425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.582475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.584613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.584672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.586122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.586167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.586652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.589076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.589132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.590568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.055 [2024-05-15 00:11:30.590614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.591694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.591754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.593133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.593179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.593501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.594735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.594790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.595175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.595219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.597291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.597350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.599027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.055 [2024-05-15 00:11:30.599073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.599350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.601679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.601735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.603407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.603453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.604306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.604358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.605384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.605435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.605745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.056 [2024-05-15 00:11:30.607817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.607872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.609666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.609718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.611667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.611727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.613370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.613420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.613849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.614853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.614904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.616516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.616569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.616981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.617038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.618553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.618599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.618872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.621462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.621518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.621560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.621600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.622440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.622494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.056 [2024-05-15 00:11:30.622537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.622578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.622851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.623819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.623870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.623912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.623952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.624362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.624424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.624466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.624508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.624779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.625727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.625777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.625817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.625858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.626269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.626324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.626366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.626417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.626860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.627864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.627914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.627962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.056 [2024-05-15 00:11:30.628005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.628417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.628471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.628512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.628552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.628824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.629710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.629761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.631301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.631348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.631771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.633582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.633631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.634021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.634437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.636766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.636822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.638497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.638543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.056 [2024-05-15 00:11:30.639080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.640497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.640544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.641974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.642252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.318 [2024-05-15 00:11:30.643544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.643598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.645307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.645359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.645773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.318 [2024-05-15 00:11:30.647546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.647601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.649088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.649366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.651928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.651984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.652898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.652945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.653487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.654209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.654257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.655637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.655932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.657962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.658019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.659397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.659453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.659898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.661577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.319 [2024-05-15 00:11:30.661628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.662019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.662361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.664632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.664688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.666365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.666417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.666909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.668539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.668591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.670136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.670416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.671749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.671804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.673141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.673187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.673637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.675062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.675110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.676793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.677206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.679842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.679900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.681309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.319 [2024-05-15 00:11:30.681353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.681959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.682354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.682411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.683968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.684246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.685758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.685815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.687223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.687268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.687717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.687770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.689447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.689494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.689872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.692420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.692477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.692522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.693922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.695668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.695723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.697367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.697424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.697699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.319 [2024-05-15 00:11:30.698640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.698691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.699083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.699126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.699651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.701082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.701129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.701178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.701455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.702383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.703009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.703056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.703096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.704929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.704984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.705025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.706705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.707070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.710541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.710598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.710652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.712420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.712934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.319 [2024-05-15 00:11:30.712982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.587 [2024-05-15 00:11:31.009234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.009769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.009817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.009857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.009899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.010219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.011081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.011133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.012407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.012452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.012866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.014672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.014720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.016396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.016681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.018430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.018487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.019880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.019925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.020408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.021828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.021876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.022754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.023028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.587 [2024-05-15 00:11:31.025327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.025383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.025786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.025838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.026333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.028042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.028095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.029552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.029826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.032077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.032133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.033562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.033609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.034056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.035126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.035173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.035562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.035895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.038302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.038364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.040116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.040169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.040580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.041981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.587 [2024-05-15 00:11:31.042029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.043427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.043739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.045315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.045373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.046765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.046811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.047221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.048646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.048699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.049939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.050217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.052589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.052645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.053381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.053435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.053913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.587 [2024-05-15 00:11:31.054429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.054481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.055663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.055940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.058434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.058496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.058887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.588 [2024-05-15 00:11:31.058937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.059476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.060744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.060795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.061822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.062172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.064307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.064364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.064754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.064796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.065258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.065306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.066540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.066591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.066865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.069543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.069607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.069653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.070051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.071981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.072038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.073178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.073226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.073544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.588 [2024-05-15 00:11:31.074534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.074592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.074982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.075032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.075533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.076562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.076612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.076652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.076969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.077948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.079353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.079406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.079447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.080274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.080330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.080371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.081385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.081701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.083808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.083868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.083908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.085224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.085754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.085809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.588 [2024-05-15 00:11:31.086213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.086259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.086539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.087540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.087595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.089178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.089234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.089688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.090185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.090235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.090276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.090614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.091576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.091990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.092040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.092081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.092863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.092919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.092961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.093353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.093634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.095282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.095341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.095382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.588 [2024-05-15 00:11:31.096778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.097190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.097239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.098550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.098599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.099041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.100189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.100239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.101632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.101677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.102134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.103663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.103713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.103760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.104030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.588 [2024-05-15 00:11:31.105040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.106264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.106313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.106353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.108207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.108262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.108303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.109779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.110135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.589 [2024-05-15 00:11:31.112667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.112728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.112769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.114104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.114735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.114797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.115188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.115243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.115608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.116565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.116629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.117690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.117744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.118151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.118652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.118703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.118744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.119030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.119959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.120022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.120069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.120110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.121195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.121604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.589 [2024-05-15 00:11:31.121657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.122047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.122429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.123867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.123924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.123967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.124357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.124842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.124904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.125296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.125345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.125691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.126779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.127184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.127233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.127633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.128049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.128468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.128529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.128924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.129249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.130382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.130793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.130842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.589 [2024-05-15 00:11:31.131230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.131782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.132183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.132239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.132638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.133106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.134581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.134980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.135028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.135430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.136035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.136444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.136495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.136893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.137348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.138522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.138922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.138972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.139359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.139892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.140291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.140341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.140738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.141086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.589 [2024-05-15 00:11:31.141973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.142380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.142438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.142840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.143311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.143717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.143786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.144179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.144583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.146285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.146698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.146750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.147141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.148112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.148524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.589 [2024-05-15 00:11:31.148579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.148969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.149346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.150343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.150765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.151158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.151559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.152050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.152473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.590 [2024-05-15 00:11:31.152868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.153264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.153635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.155128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.155542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.155938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.156329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.157342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.158650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.160150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.161037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.161387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.162713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.163702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.164987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.166167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.167909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.168973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.169370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.169773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.590 [2024-05-15 00:11:31.170082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.172415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.173723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.174444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.851 [2024-05-15 00:11:31.174841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.177001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.178527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.178932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.180569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.180847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.182295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.183822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.183881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.184467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.185229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.185286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.185686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.185739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.186018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.187012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.188604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.188657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.189940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.190821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.190884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.191300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.191346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.191627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.851 [2024-05-15 00:11:31.192576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.193603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.193652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.194673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.196267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.196338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.196744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.196798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.197197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.198229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.198641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.198695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.200190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.200954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.201011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.201409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.201460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.201732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.202759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.203407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.203458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.203852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.205845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:30.851 [2024-05-15 00:11:31.205905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:30.851 [2024-05-15 00:11:31.207551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:30.851 [... the same *ERROR* line repeats continuously (timestamps 00:11:31.207605 through 00:11:31.350591) while the 128-deep verify workload runs; the duplicate lines are elided here ...]
00:29:31.421 
00:29:31.421 Latency(us)
00:29:31.421 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:31.421 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:31.421 Verification LBA range: start 0x0 length 0x100
00:29:31.421 crypto_ram : 5.88 43.55 2.72 0.00 0.00 2851759.86 60635.05 2698943.44
00:29:31.421 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:31.421 Verification LBA range: start 0x100 length 0x100
00:29:31.422 crypto_ram : 5.84 43.87 2.74 0.00 0.00 2823575.82 73856.22 2625999.03
00:29:31.422 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x0 length 0x100
00:29:31.422 crypto_ram2 : 5.88 43.54 2.72 0.00 0.00 2744537.27 60179.14 2698943.44
00:29:31.422 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x100 length 0x100
00:29:31.422 crypto_ram2 : 5.84 43.86 2.74 0.00 0.00 2719403.85 73400.32 2523876.84
00:29:31.422 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x0 length 0x100
00:29:31.422 crypto_ram3 : 5.63 270.28 16.89 0.00 0.00 419723.65 3476.26 601791.44
00:29:31.422 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x100 length 0x100
00:29:31.422 crypto_ram3 : 5.59 285.91 17.87 0.00 0.00 398680.63 33964.74 601791.44
00:29:31.422 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x0 length 0x100
00:29:31.422 crypto_ram4 : 5.73 286.50 17.91 0.00 0.00 384895.36 19489.84 503316.48
00:29:31.422 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:31.422 Verification LBA range: start 0x100 length 0x100
00:29:31.422 crypto_ram4 : 5.71 301.24 18.83 0.00 0.00 367102.46 10884.67 506963.70
00:29:31.422 ===================================================================================================================
00:29:31.422 Total : 1318.75 82.42 0.00 0.00 718227.44 3476.26 2698943.44
00:29:31.679 
00:29:31.679 real 0m9.317s
00:29:31.679 user 0m17.532s
00:29:31.679 sys 0m0.587s
00:29:31.679 00:11:32 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:29:31.679 00:11:32 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:29:31.680 ************************************
00:29:31.680 END TEST bdev_verify_big_io
00:29:31.680 ************************************
00:29:31.938 00:11:32 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:31.938 00:11:32 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:29:31.938 00:11:32 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable
00:29:31.938 00:11:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:31.938
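For reference, the big-I/O verify pass above is a plain bdevperf run against the generated crypto stack. A minimal sketch of an equivalent invocation is below; the binary path, --json config, and the -q/-o/-w flags all appear verbatim elsewhere in this log, while the -t 10 runtime and the lower-depth re-run are assumptions (the original command line for this test is not visible in this excerpt). The "Failed to get src_mbufs" spam most likely reflects transient pressure on the DPDK cryptodev module's mbuf pool at queue depth 128 with 64 KiB I/O; note the jobs still report 0.00 in the Fail/s column, so the allocation errors did not fail the workload.

  # Hedged sketch of re-running the verify workload shown above (flags mirror the
  # bdevperf invocations in this log; -t 10 and the -q 32 variant are assumptions).
  BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
  CONF=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json

  # Same shape as the job table: 128-deep, 64 KiB, verify workload.
  $BDEVPERF --json "$CONF" -q 128 -o 65536 -w verify -t 10

  # Lowering the queue depth is one way to check whether the src_mbufs errors disappear.
  $BDEVPERF --json "$CONF" -q 32 -o 65536 -w verify -t 10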
************************************ 00:29:31.938 START TEST bdev_write_zeroes 00:29:31.938 ************************************ 00:29:31.938 00:11:32 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:31.938 [2024-05-15 00:11:32.370008] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:29:31.938 [2024-05-15 00:11:32.370066] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid554783 ] 00:29:31.938 [2024-05-15 00:11:32.498143] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:32.197 [2024-05-15 00:11:32.595928] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:32.197 [2024-05-15 00:11:32.617196] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:32.197 [2024-05-15 00:11:32.625224] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:32.197 [2024-05-15 00:11:32.633243] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:32.197 [2024-05-15 00:11:32.739354] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:34.730 [2024-05-15 00:11:35.167337] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:34.730 [2024-05-15 00:11:35.167406] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:34.730 [2024-05-15 00:11:35.167423] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.730 [2024-05-15 00:11:35.175356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:34.730 [2024-05-15 00:11:35.175374] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:34.730 [2024-05-15 00:11:35.175386] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.730 [2024-05-15 00:11:35.183375] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:34.730 [2024-05-15 00:11:35.183392] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:34.730 [2024-05-15 00:11:35.183408] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.730 [2024-05-15 00:11:35.191396] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:34.730 [2024-05-15 00:11:35.191417] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:34.730 [2024-05-15 00:11:35.191428] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.730 Running I/O for 1 seconds... 
00:29:35.726 
00:29:35.726 Latency(us)
00:29:35.726 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:35.726 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:35.726 crypto_ram : 1.03 1972.61 7.71 0.00 0.00 64370.07 5413.84 77047.54
00:29:35.726 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:35.726 crypto_ram2 : 1.03 1978.33 7.73 0.00 0.00 63830.10 5385.35 72032.61
00:29:35.726 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:35.726 crypto_ram3 : 1.02 15166.94 59.25 0.00 0.00 8315.23 2464.72 10713.71
00:29:35.726 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:35.727 crypto_ram4 : 1.02 15151.62 59.19 0.00 0.00 8287.38 2450.48 8776.13
00:29:35.727 ===================================================================================================================
00:29:35.727 Total : 34269.50 133.87 0.00 0.00 14761.38 2450.48 77047.54
00:29:36.296 
00:29:36.296 real 0m4.417s
00:29:36.296 user 0m3.842s
00:29:36.296 sys 0m0.531s
00:29:36.296 00:11:36 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable
00:29:36.296 00:11:36 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:29:36.296 ************************************
00:29:36.296 END TEST bdev_write_zeroes
00:29:36.296 ************************************
00:29:36.296 00:11:36 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:36.296 00:11:36 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:29:36.296 00:11:36 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable
00:29:36.296 00:11:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:36.296 ************************************
00:29:36.296 START TEST bdev_json_nonenclosed
00:29:36.296 ************************************
00:29:36.296 00:11:36 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:36.296 [2024-05-15 00:11:36.885045] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization...
00:29:36.296 [2024-05-15 00:11:36.885103] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555328 ]
00:29:36.555 [2024-05-15 00:11:37.012911] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:36.555 [2024-05-15 00:11:37.110607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:29:36.555 [2024-05-15 00:11:37.110677] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:29:36.555 [2024-05-15 00:11:37.110698] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:36.555 [2024-05-15 00:11:37.110712] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:36.815 00:29:36.815 real 0m0.400s 00:29:36.815 user 0m0.255s 00:29:36.815 sys 0m0.142s 00:29:36.815 00:11:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:36.815 00:11:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:29:36.815 ************************************ 00:29:36.815 END TEST bdev_json_nonenclosed 00:29:36.815 ************************************ 00:29:36.815 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:36.815 00:11:37 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:29:36.815 00:11:37 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:36.815 00:11:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:36.815 ************************************ 00:29:36.815 START TEST bdev_json_nonarray 00:29:36.815 ************************************ 00:29:36.815 00:11:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:36.815 [2024-05-15 00:11:37.373537] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:29:36.815 [2024-05-15 00:11:37.373594] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555506 ] 00:29:37.073 [2024-05-15 00:11:37.501333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:37.073 [2024-05-15 00:11:37.600627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:37.073 [2024-05-15 00:11:37.600702] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
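The two negative tests here (bdev_json_nonenclosed and bdev_json_nonarray) feed bdevperf deliberately malformed configs and expect the app to stop with exactly the json_config errors logged above and below. The actual contents of nonenclosed.json and nonarray.json are not reproduced in this log; the following is only a hypothetical sketch of the shapes those two error messages correspond to, next to a minimal well-formed config (an empty "subsystems" array is assumed to be acceptable).

  # Valid shape: a single top-level JSON object whose "subsystems" key is an array.
  cat > good.json <<'EOF'
  { "subsystems": [] }
  EOF

  # "not enclosed in {}": the top-level value is not a JSON object.
  cat > nonenclosed.json <<'EOF'
  [ { "subsystems": [] } ]
  EOF

  # "'subsystems' should be an array": right key, wrong type.
  cat > nonarray.json <<'EOF'
  { "subsystems": {} }
  EOF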
00:29:37.073 [2024-05-15 00:11:37.600723] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:37.073 [2024-05-15 00:11:37.600736] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:37.333 00:29:37.333 real 0m0.407s 00:29:37.333 user 0m0.247s 00:29:37.333 sys 0m0.158s 00:29:37.333 00:11:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:37.333 00:11:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:29:37.333 ************************************ 00:29:37.333 END TEST bdev_json_nonarray 00:29:37.333 ************************************ 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:29:37.333 00:11:37 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:29:37.333 00:29:37.333 real 1m14.143s 00:29:37.333 user 2m40.948s 00:29:37.333 sys 0m10.363s 00:29:37.333 00:11:37 blockdev_crypto_aesni -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:37.333 00:11:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:37.333 ************************************ 00:29:37.333 END TEST blockdev_crypto_aesni 00:29:37.333 ************************************ 00:29:37.333 00:11:37 -- spdk/autotest.sh@354 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:29:37.333 00:11:37 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:37.333 00:11:37 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:37.333 00:11:37 -- common/autotest_common.sh@10 -- # set +x 00:29:37.333 ************************************ 00:29:37.333 START TEST blockdev_crypto_sw 00:29:37.333 ************************************ 00:29:37.333 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:29:37.593 * Looking for test storage... 
00:29:37.593 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=555580 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 555580 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@827 -- # '[' -z 555580 ']' 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:37.593 00:11:37 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:29:37.593 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:37.593 00:11:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:37.593 [2024-05-15 00:11:38.049665] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:29:37.593 [2024-05-15 00:11:38.049736] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555580 ] 00:29:37.593 [2024-05-15 00:11:38.177515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:37.853 [2024-05-15 00:11:38.281329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:38.422 00:11:38 blockdev_crypto_sw -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:38.422 00:11:38 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # return 0 00:29:38.422 00:11:38 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:29:38.422 00:11:38 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:29:38.422 00:11:38 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:29:38.422 00:11:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.422 00:11:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.681 Malloc0 00:29:38.681 Malloc1 00:29:38.681 true 00:29:38.681 true 00:29:38.681 true 00:29:38.681 [2024-05-15 00:11:39.222242] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:38.681 crypto_ram 00:29:38.681 [2024-05-15 00:11:39.230270] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:38.681 crypto_ram2 00:29:38.681 [2024-05-15 00:11:39.238293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:38.681 crypto_ram3 00:29:38.681 [ 00:29:38.681 { 00:29:38.681 "name": "Malloc1", 00:29:38.681 "aliases": [ 00:29:38.681 "011456f7-7f13-4345-9867-a7850d874a30" 00:29:38.681 ], 00:29:38.681 "product_name": "Malloc disk", 00:29:38.681 "block_size": 4096, 00:29:38.681 "num_blocks": 4096, 00:29:38.681 "uuid": "011456f7-7f13-4345-9867-a7850d874a30", 00:29:38.681 "assigned_rate_limits": { 00:29:38.681 "rw_ios_per_sec": 0, 00:29:38.681 "rw_mbytes_per_sec": 0, 00:29:38.681 "r_mbytes_per_sec": 0, 00:29:38.681 "w_mbytes_per_sec": 0 00:29:38.681 }, 00:29:38.681 "claimed": true, 00:29:38.681 "claim_type": "exclusive_write", 00:29:38.681 "zoned": false, 00:29:38.681 "supported_io_types": { 00:29:38.681 "read": true, 00:29:38.681 "write": true, 00:29:38.681 "unmap": true, 00:29:38.681 "write_zeroes": true, 00:29:38.681 "flush": true, 00:29:38.681 "reset": true, 00:29:38.681 "compare": false, 00:29:38.681 "compare_and_write": false, 00:29:38.681 "abort": true, 00:29:38.681 "nvme_admin": false, 00:29:38.681 "nvme_io": false 00:29:38.681 }, 00:29:38.681 "memory_domains": [ 00:29:38.681 { 00:29:38.681 "dma_device_id": "system", 00:29:38.681 "dma_device_type": 1 00:29:38.681 }, 00:29:38.681 { 00:29:38.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.681 "dma_device_type": 2 00:29:38.681 } 00:29:38.681 ], 00:29:38.681 "driver_specific": {} 00:29:38.681 } 00:29:38.681 ] 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.681 00:11:39 blockdev_crypto_sw -- 
bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.681 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:29:38.681 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.681 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "77178265-6f73-5699-9e7a-aaf7bf4941e1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "77178265-6f73-5699-9e7a-aaf7bf4941e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e222d605-d54a-5719-99b0-1c3bd8660010"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' 
' "num_blocks": 4096,' ' "uuid": "e222d605-d54a-5719-99b0-1c3bd8660010",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:29:38.941 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 555580 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@946 -- # '[' -z 555580 ']' 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # kill -0 555580 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # uname 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 555580 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@964 -- # echo 'killing process with pid 555580' 00:29:38.941 killing process with pid 555580 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@965 -- # kill 555580 00:29:38.941 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@970 -- # wait 555580 00:29:39.510 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:39.510 00:11:39 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:39.510 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:29:39.510 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:39.510 00:11:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:39.510 ************************************ 00:29:39.510 START TEST bdev_hello_world 00:29:39.510 ************************************ 00:29:39.510 00:11:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:39.510 [2024-05-15 00:11:39.952476] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 
23.11.0 initialization... 00:29:39.510 [2024-05-15 00:11:39.952536] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555889 ] 00:29:39.510 [2024-05-15 00:11:40.083356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:39.768 [2024-05-15 00:11:40.185176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:39.768 [2024-05-15 00:11:40.357211] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:39.768 [2024-05-15 00:11:40.357282] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:39.768 [2024-05-15 00:11:40.357298] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:40.027 [2024-05-15 00:11:40.365231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:40.027 [2024-05-15 00:11:40.365251] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:40.027 [2024-05-15 00:11:40.365262] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:40.027 [2024-05-15 00:11:40.373252] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:40.027 [2024-05-15 00:11:40.373270] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:40.027 [2024-05-15 00:11:40.373281] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:40.027 [2024-05-15 00:11:40.415020] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:29:40.027 [2024-05-15 00:11:40.415056] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:29:40.027 [2024-05-15 00:11:40.415076] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:29:40.027 [2024-05-15 00:11:40.416335] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:29:40.027 [2024-05-15 00:11:40.416417] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:29:40.027 [2024-05-15 00:11:40.416433] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:29:40.027 [2024-05-15 00:11:40.416468] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
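The repeated 'Found key "test_dek_sw"' notices above come from rpc_bdev_crypto_create resolving keys that the test script registered with the accel framework earlier; the full RPC arguments are not echoed in this excerpt. A rough sketch of how such a software-crypto stack is typically assembled is shown below. The rpc.py location, the AES_CBC cipher choice, the placeholder key material, and the exact flag names are assumptions from recent SPDK releases and may differ from what blockdev.sh actually issues here.

  # Hedged sketch only -- not the test script's literal RPC sequence.
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

  # Register a software crypto key with the accel layer (key value is a placeholder).
  $RPC accel_crypto_key_create -c AES_CBC -n test_dek_sw \
      -k 00112233445566778899aabbccddeeff

  # Create a base malloc bdev (16 MiB, 512-byte blocks, matching crypto_ram's geometry)
  # and layer the crypto vbdev on top of it using the registered key.
  $RPC bdev_malloc_create -b Malloc0 16 512
  $RPC bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw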
00:29:40.027 00:29:40.027 [2024-05-15 00:11:40.416485] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:29:40.287 00:29:40.287 real 0m0.771s 00:29:40.287 user 0m0.514s 00:29:40.287 sys 0m0.243s 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:29:40.287 ************************************ 00:29:40.287 END TEST bdev_hello_world 00:29:40.287 ************************************ 00:29:40.287 00:11:40 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:29:40.287 00:11:40 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:40.287 00:11:40 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:40.287 00:11:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:40.287 ************************************ 00:29:40.287 START TEST bdev_bounds 00:29:40.287 ************************************ 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=555974 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 555974' 00:29:40.287 Process bdevio pid: 555974 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 555974 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 555974 ']' 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:40.287 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:40.287 00:11:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:40.287 [2024-05-15 00:11:40.804331] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:29:40.287 [2024-05-15 00:11:40.804392] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid555974 ] 00:29:40.546 [2024-05-15 00:11:40.932332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:40.546 [2024-05-15 00:11:41.039634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:40.546 [2024-05-15 00:11:41.039717] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:29:40.546 [2024-05-15 00:11:41.039721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:40.805 [2024-05-15 00:11:41.202223] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:40.805 [2024-05-15 00:11:41.202293] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:40.805 [2024-05-15 00:11:41.202308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:40.805 [2024-05-15 00:11:41.210243] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:40.805 [2024-05-15 00:11:41.210264] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:40.805 [2024-05-15 00:11:41.210276] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:40.805 [2024-05-15 00:11:41.218264] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:40.805 [2024-05-15 00:11:41.218283] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:40.805 [2024-05-15 00:11:41.218294] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:41.374 I/O targets: 00:29:41.374 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:29:41.374 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:29:41.374 00:29:41.374 00:29:41.374 CUnit - A unit testing framework for C - Version 2.1-3 00:29:41.374 http://cunit.sourceforge.net/ 00:29:41.374 00:29:41.374 00:29:41.374 Suite: bdevio tests on: crypto_ram3 00:29:41.374 Test: blockdev write read block ...passed 00:29:41.374 Test: blockdev write zeroes read block ...passed 00:29:41.374 Test: blockdev write zeroes read no split ...passed 00:29:41.374 Test: blockdev write zeroes read split ...passed 00:29:41.374 Test: blockdev write zeroes read split partial ...passed 00:29:41.374 Test: blockdev reset ...passed 00:29:41.374 Test: blockdev write read 8 blocks ...passed 00:29:41.374 Test: blockdev write read size > 128k ...passed 00:29:41.374 Test: blockdev write read invalid size ...passed 00:29:41.374 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:41.374 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:41.374 Test: blockdev write read max offset ...passed 00:29:41.374 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:41.374 Test: blockdev writev readv 8 blocks ...passed 00:29:41.374 Test: 
blockdev writev readv 30 x 1block ...passed 00:29:41.374 Test: blockdev writev readv block ...passed 00:29:41.374 Test: blockdev writev readv size > 128k ...passed 00:29:41.374 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:41.374 Test: blockdev comparev and writev ...passed 00:29:41.374 Test: blockdev nvme passthru rw ...passed 00:29:41.374 Test: blockdev nvme passthru vendor specific ...passed 00:29:41.374 Test: blockdev nvme admin passthru ...passed 00:29:41.374 Test: blockdev copy ...passed 00:29:41.374 Suite: bdevio tests on: crypto_ram 00:29:41.374 Test: blockdev write read block ...passed 00:29:41.374 Test: blockdev write zeroes read block ...passed 00:29:41.374 Test: blockdev write zeroes read no split ...passed 00:29:41.374 Test: blockdev write zeroes read split ...passed 00:29:41.374 Test: blockdev write zeroes read split partial ...passed 00:29:41.374 Test: blockdev reset ...passed 00:29:41.374 Test: blockdev write read 8 blocks ...passed 00:29:41.374 Test: blockdev write read size > 128k ...passed 00:29:41.374 Test: blockdev write read invalid size ...passed 00:29:41.374 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:41.374 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:41.374 Test: blockdev write read max offset ...passed 00:29:41.374 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:41.374 Test: blockdev writev readv 8 blocks ...passed 00:29:41.374 Test: blockdev writev readv 30 x 1block ...passed 00:29:41.374 Test: blockdev writev readv block ...passed 00:29:41.374 Test: blockdev writev readv size > 128k ...passed 00:29:41.374 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:41.374 Test: blockdev comparev and writev ...passed 00:29:41.374 Test: blockdev nvme passthru rw ...passed 00:29:41.374 Test: blockdev nvme passthru vendor specific ...passed 00:29:41.374 Test: blockdev nvme admin passthru ...passed 00:29:41.374 Test: blockdev copy ...passed 00:29:41.374 00:29:41.374 Run Summary: Type Total Ran Passed Failed Inactive 00:29:41.374 suites 2 2 n/a 0 0 00:29:41.374 tests 46 46 46 0 0 00:29:41.374 asserts 260 260 260 0 n/a 00:29:41.374 00:29:41.374 Elapsed time = 0.089 seconds 00:29:41.374 0 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 555974 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 555974 ']' 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 555974 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 555974 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 555974' 00:29:41.374 killing process with pid 555974 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@965 -- # kill 555974 00:29:41.374 00:11:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@970 -- # wait 555974 00:29:41.633 
00:11:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:29:41.633 00:29:41.633 real 0m1.446s 00:29:41.633 user 0m3.735s 00:29:41.633 sys 0m0.360s 00:29:41.633 00:11:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:41.633 00:11:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:41.633 ************************************ 00:29:41.633 END TEST bdev_bounds 00:29:41.633 ************************************ 00:29:41.893 00:11:42 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:41.893 00:11:42 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:29:41.893 00:11:42 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:41.893 00:11:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:41.893 ************************************ 00:29:41.893 START TEST bdev_nbd 00:29:41.893 ************************************ 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=556178 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 556178 /var/tmp/spdk-nbd.sock 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 556178 ']' 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:29:41.893 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:41.893 00:11:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:41.893 [2024-05-15 00:11:42.355865] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:29:41.893 [2024-05-15 00:11:42.355921] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:41.893 [2024-05-15 00:11:42.472270] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:42.152 [2024-05-15 00:11:42.577146] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:42.411 [2024-05-15 00:11:42.756437] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:42.411 [2024-05-15 00:11:42.756505] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:42.411 [2024-05-15 00:11:42.756521] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.411 [2024-05-15 00:11:42.764454] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:42.411 [2024-05-15 00:11:42.764474] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:42.411 [2024-05-15 00:11:42.764486] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.411 [2024-05-15 00:11:42.772468] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:42.411 [2024-05-15 00:11:42.772487] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:42.411 [2024-05-15 00:11:42.772498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:42.669 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:42.929 1+0 records in 00:29:42.929 1+0 records out 00:29:42.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257425 s, 15.9 MB/s 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:42.929 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:43.189 1+0 records in 00:29:43.189 1+0 records out 00:29:43.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311906 s, 13.1 MB/s 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:43.189 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:43.448 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:43.448 { 00:29:43.448 "nbd_device": "/dev/nbd0", 00:29:43.448 "bdev_name": "crypto_ram" 00:29:43.448 }, 00:29:43.448 { 00:29:43.448 "nbd_device": "/dev/nbd1", 00:29:43.448 "bdev_name": "crypto_ram3" 00:29:43.448 } 00:29:43.448 ]' 00:29:43.448 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:43.448 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:43.448 { 00:29:43.448 "nbd_device": "/dev/nbd0", 00:29:43.448 "bdev_name": "crypto_ram" 00:29:43.448 }, 00:29:43.448 { 00:29:43.448 "nbd_device": "/dev/nbd1", 00:29:43.448 "bdev_name": "crypto_ram3" 00:29:43.448 } 00:29:43.448 ]' 00:29:43.448 00:11:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
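The nbd_rpc_start_stop_verify phase traced here reduces to a short RPC sequence against the dedicated /var/tmp/spdk-nbd.sock socket. A condensed sketch, with paths shortened; the real waitfornbd helper retries the grep/dd pair up to 20 times while the kernel brings the device up:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

$RPC nbd_start_disk crypto_ram /dev/nbd0
$RPC nbd_start_disk crypto_ram3 /dev/nbd1

for nbd in nbd0 nbd1; do
    # waitfornbd: wait for the kernel to list the device, then read one block through it
    until grep -q -w "$nbd" /proc/partitions; do sleep 0.1; done
    dd if=/dev/$nbd of=/tmp/nbdtest bs=4096 count=1 iflag=direct
done

$RPC nbd_get_disks            # prints [{"nbd_device": ..., "bdev_name": ...}, ...]
$RPC nbd_stop_disk /dev/nbd0
$RPC nbd_stop_disk /dev/nbd1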
00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:43.448 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:43.707 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:43.966 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:44.225 00:11:44 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:44.225 00:11:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:44.484 /dev/nbd0 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:29:44.485 1+0 records in 00:29:44.485 1+0 records out 00:29:44.485 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272388 s, 15.0 MB/s 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:44.485 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:29:44.748 /dev/nbd1 00:29:44.748 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:45.006 1+0 records in 00:29:45.006 1+0 records out 00:29:45.006 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278799 s, 14.7 MB/s 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:45.006 { 00:29:45.006 "nbd_device": "/dev/nbd0", 00:29:45.006 "bdev_name": "crypto_ram" 00:29:45.006 }, 00:29:45.006 { 00:29:45.006 "nbd_device": "/dev/nbd1", 00:29:45.006 "bdev_name": "crypto_ram3" 00:29:45.006 } 00:29:45.006 ]' 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:45.006 { 00:29:45.006 "nbd_device": "/dev/nbd0", 00:29:45.006 "bdev_name": "crypto_ram" 00:29:45.006 }, 00:29:45.006 { 00:29:45.006 "nbd_device": "/dev/nbd1", 00:29:45.006 "bdev_name": "crypto_ram3" 00:29:45.006 } 00:29:45.006 ]' 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:45.006 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:45.006 /dev/nbd1' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:45.265 /dev/nbd1' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:45.265 256+0 records in 00:29:45.265 256+0 records out 00:29:45.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105135 s, 99.7 MB/s 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:45.265 256+0 records in 00:29:45.265 256+0 records out 00:29:45.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.02118 s, 49.5 MB/s 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:45.265 256+0 records in 00:29:45.265 256+0 records out 00:29:45.265 
1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0441874 s, 23.7 MB/s 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:45.265 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:45.524 00:11:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:45.524 00:11:46 
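The nbd_dd_data_verify pass just traced follows a simple write-then-compare pattern: one random 1 MiB file is written through each NBD device with direct I/O and then compared back with cmp. Condensed, with the workspace-local nbdrandtest path shortened:

TESTFILE=/tmp/nbdrandtest
dd if=/dev/urandom of=$TESTFILE bs=4096 count=256            # 1 MiB of random data

for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=$TESTFILE of=$nbd bs=4096 count=256 oflag=direct   # write pass
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M $TESTFILE $nbd                              # verify pass: fails on any mismatch
done
rm $TESTFILE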
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:45.524 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:45.782 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:45.782 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:45.782 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:45.782 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:46.039 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:46.591 malloc_lvol_verify 00:29:46.591 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:46.591 e0c3d8f7-f72d-4803-af8f-87505b3084aa 00:29:46.591 00:11:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
bdev_lvol_create lvol 4 -l lvs 00:29:46.591 cb9e1c12-bb02-46e1-b587-cbdf106380e3 00:29:46.591 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:46.849 /dev/nbd0 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:46.849 mke2fs 1.46.5 (30-Dec-2021) 00:29:46.849 Discarding device blocks: 0/4096 done 00:29:46.849 Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:46.849 00:29:46.849 Allocating group tables: 0/1 done 00:29:46.849 Writing inode tables: 0/1 done 00:29:46.849 Creating journal (1024 blocks): done 00:29:46.849 Writing superblocks and filesystem accounting information: 0/1 done 00:29:46.849 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:46.849 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 556178 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 556178 ']' 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 556178 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 556178 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = 
sudo ']' 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 556178' 00:29:47.107 killing process with pid 556178 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@965 -- # kill 556178 00:29:47.107 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@970 -- # wait 556178 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:29:47.366 00:29:47.366 real 0m5.529s 00:29:47.366 user 0m7.747s 00:29:47.366 sys 0m2.298s 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:47.366 ************************************ 00:29:47.366 END TEST bdev_nbd 00:29:47.366 ************************************ 00:29:47.366 00:11:47 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:47.366 00:11:47 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:29:47.366 00:11:47 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:29:47.366 00:11:47 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:47.366 00:11:47 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:47.366 00:11:47 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:47.366 00:11:47 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:47.366 ************************************ 00:29:47.366 START TEST bdev_fio 00:29:47.366 ************************************ 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:47.366 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:47.366 
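The fio_config_gen call being prepared here writes a small job file to which the [job_crypto_ram] and [job_crypto_ram3] sections echoed further down are appended, and the rw-verify test then runs it through fio's external spdk_bdev ioengine. A rough reconstruction under assumptions (the [global] options are inferred from a typical verify workload, not copied from the generated file):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

cat > "$SPDK/test/bdev/bdev.fio" <<'EOF'
[global]
; global options below are inferred for a verify workload, not copied from the generated file
thread=1
direct=1
rw=randwrite
verify=md5
verify_backlog=1024
serialize_overlap=1

[job_crypto_ram]
filename=crypto_ram

[job_crypto_ram3]
filename=crypto_ram3
EOF

# Driven exactly as in the trace: LD_PRELOAD loads the SPDK fio plugin and the
# bdevs are instantiated from bdev.json inside the fio process.
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$SPDK/test/bdev/bdev.fio" --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" --spdk_mem=0 \
    --aux-path="$SPDK/../output"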
00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:29:47.366 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:47.626 00:11:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:47.626 ************************************ 00:29:47.626 START TEST bdev_fio_rw_verify 00:29:47.626 ************************************ 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:47.626 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:47.627 00:11:48 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:47.885 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, 
(T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:47.885 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:47.885 fio-3.35 00:29:47.885 Starting 2 threads 00:30:00.126 00:30:00.126 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=557286: Wed May 15 00:11:58 2024 00:30:00.126 read: IOPS=21.9k, BW=85.4MiB/s (89.5MB/s)(854MiB/10001msec) 00:30:00.126 slat (nsec): min=14171, max=94620, avg=20035.14, stdev=3542.15 00:30:00.126 clat (usec): min=6, max=455, avg=145.96, stdev=58.25 00:30:00.126 lat (usec): min=24, max=487, avg=166.00, stdev=59.65 00:30:00.126 clat percentiles (usec): 00:30:00.126 | 50.000th=[ 143], 99.000th=[ 281], 99.900th=[ 302], 99.990th=[ 347], 00:30:00.127 | 99.999th=[ 416] 00:30:00.127 write: IOPS=26.2k, BW=102MiB/s (107MB/s)(971MiB/9484msec); 0 zone resets 00:30:00.127 slat (usec): min=14, max=1710, avg=33.60, stdev= 5.44 00:30:00.127 clat (usec): min=26, max=2078, avg=195.66, stdev=89.54 00:30:00.127 lat (usec): min=54, max=2109, avg=229.26, stdev=91.17 00:30:00.127 clat percentiles (usec): 00:30:00.127 | 50.000th=[ 190], 99.000th=[ 388], 99.900th=[ 412], 99.990th=[ 594], 00:30:00.127 | 99.999th=[ 2008] 00:30:00.127 bw ( KiB/s): min=93776, max=106144, per=94.92%, avg=99548.21, stdev=1763.06, samples=38 00:30:00.127 iops : min=23444, max=26536, avg=24887.05, stdev=440.76, samples=38 00:30:00.127 lat (usec) : 10=0.01%, 20=0.01%, 50=4.66%, 100=14.75%, 250=63.36% 00:30:00.127 lat (usec) : 500=17.20%, 750=0.01%, 1000=0.01% 00:30:00.127 lat (msec) : 2=0.01%, 4=0.01% 00:30:00.127 cpu : usr=99.61%, sys=0.00%, ctx=31, majf=0, minf=459 00:30:00.127 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:00.127 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:00.127 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:00.127 issued rwts: total=218643,248666,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:00.127 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:00.127 00:30:00.127 Run status group 0 (all jobs): 00:30:00.127 READ: bw=85.4MiB/s (89.5MB/s), 85.4MiB/s-85.4MiB/s (89.5MB/s-89.5MB/s), io=854MiB (896MB), run=10001-10001msec 00:30:00.127 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=971MiB (1019MB), run=9484-9484msec 00:30:00.127 00:30:00.127 real 0m11.118s 00:30:00.127 user 0m23.733s 00:30:00.127 sys 0m0.357s 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:00.127 ************************************ 00:30:00.127 END TEST bdev_fio_rw_verify 00:30:00.127 ************************************ 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:30:00.127 00:11:59 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "77178265-6f73-5699-9e7a-aaf7bf4941e1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "77178265-6f73-5699-9e7a-aaf7bf4941e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e222d605-d54a-5719-99b0-1c3bd8660010"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "e222d605-d54a-5719-99b0-1c3bd8660010",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 
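The long JSON documents printed above are the standard per-bdev descriptions (bdev_get_bdevs format); the trim stage pipes them through the jq filter shown just below so that only bdevs advertising unmap support get a trim job. A hypothetical stand-alone equivalent, assuming a running SPDK application serving the default RPC socket (not the case at this exact point in the run, where the descriptions were captured earlier into a shell variable):

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$RPC bdev_get_bdevs | jq -r '.[] | select(.supported_io_types.unmap == true) | .name'
# expected here: crypto_ram and crypto_ram3, since both report "unmap": true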
00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:00.127 crypto_ram3 ]] 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "77178265-6f73-5699-9e7a-aaf7bf4941e1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "77178265-6f73-5699-9e7a-aaf7bf4941e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e222d605-d54a-5719-99b0-1c3bd8660010"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "e222d605-d54a-5719-99b0-1c3bd8660010",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim 
fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:00.127 ************************************ 00:30:00.127 START TEST bdev_fio_trim 00:30:00.127 ************************************ 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:30:00.127 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:00.128 00:11:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:00.128 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:00.128 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:00.128 fio-3.35 00:30:00.128 Starting 2 threads 00:30:10.102 00:30:10.102 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=558815: Wed May 15 00:12:10 2024 00:30:10.102 write: IOPS=39.9k, BW=156MiB/s (164MB/s)(1560MiB/10001msec); 0 zone resets 00:30:10.102 slat (usec): min=14, max=1656, avg=21.90, stdev= 5.00 00:30:10.102 clat (usec): min=21, max=1975, avg=164.45, stdev=91.32 00:30:10.102 lat (usec): min=37, max=2005, avg=186.35, stdev=94.59 00:30:10.102 clat percentiles (usec): 00:30:10.102 | 50.000th=[ 131], 99.000th=[ 343], 99.900th=[ 367], 99.990th=[ 445], 00:30:10.102 | 99.999th=[ 758] 00:30:10.103 bw ( KiB/s): min=155976, max=162232, per=100.00%, avg=159822.32, stdev=694.86, samples=38 00:30:10.103 iops : min=38994, max=40558, avg=39955.68, stdev=173.65, samples=38 00:30:10.103 trim: IOPS=39.9k, BW=156MiB/s (164MB/s)(1560MiB/10001msec); 0 zone resets 00:30:10.103 slat (usec): min=6, max=484, avg=10.06, stdev= 2.40 00:30:10.103 clat (usec): min=37, max=640, avg=109.60, stdev=33.07 00:30:10.103 lat (usec): min=46, max=648, avg=119.66, stdev=33.27 00:30:10.103 clat percentiles (usec): 00:30:10.103 | 50.000th=[ 112], 99.000th=[ 182], 99.900th=[ 194], 99.990th=[ 273], 00:30:10.103 | 99.999th=[ 519] 00:30:10.103 bw ( KiB/s): min=156000, max=162224, per=100.00%, avg=159824.42, stdev=693.37, samples=38 00:30:10.103 iops : min=39000, max=40556, avg=39956.00, stdev=173.34, samples=38 00:30:10.103 lat (usec) : 50=4.13%, 100=34.29%, 250=48.51%, 500=13.06%, 750=0.01% 00:30:10.103 lat (usec) : 1000=0.01% 00:30:10.103 lat (msec) : 2=0.01% 00:30:10.103 cpu : usr=99.61%, sys=0.00%, ctx=30, majf=0, minf=331 00:30:10.103 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:10.103 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:10.103 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:10.103 issued rwts: total=0,399307,399308,0 short=0,0,0,0 dropped=0,0,0,0 00:30:10.103 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:10.103 00:30:10.103 Run status group 0 (all jobs): 00:30:10.103 WRITE: 
bw=156MiB/s (164MB/s), 156MiB/s-156MiB/s (164MB/s-164MB/s), io=1560MiB (1636MB), run=10001-10001msec 00:30:10.103 TRIM: bw=156MiB/s (164MB/s), 156MiB/s-156MiB/s (164MB/s-164MB/s), io=1560MiB (1636MB), run=10001-10001msec 00:30:10.103 00:30:10.103 real 0m11.124s 00:30:10.103 user 0m23.730s 00:30:10.103 sys 0m0.341s 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:30:10.103 ************************************ 00:30:10.103 END TEST bdev_fio_trim 00:30:10.103 ************************************ 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:30:10.103 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:30:10.103 00:30:10.103 real 0m22.611s 00:30:10.103 user 0m47.642s 00:30:10.103 sys 0m0.901s 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:10.103 ************************************ 00:30:10.103 END TEST bdev_fio 00:30:10.103 ************************************ 00:30:10.103 00:12:10 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:10.103 00:12:10 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:10.103 00:12:10 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:30:10.103 00:12:10 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:10.103 00:12:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:10.103 ************************************ 00:30:10.103 START TEST bdev_verify 00:30:10.103 ************************************ 00:30:10.103 00:12:10 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:10.103 [2024-05-15 00:12:10.669167] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
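For anyone reproducing the bdev_fio_trim run above outside the harness: the test drives fio through SPDK's bdev ioengine plugin using exactly the flags shown in the trace. The sketch below is illustrative only; the job file body is an assumption modeled on the generated bdev.fio, and the paths and bdev names are copied from this log.

    # Sketch: run fio against SPDK bdevs via the spdk_bdev ioengine plugin,
    # mirroring the bdev_fio_trim invocation recorded above.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    cat > /tmp/bdev_trim.fio <<'EOF'
    [global]
    ioengine=spdk_bdev
    thread=1
    iodepth=8
    bs=4k
    rw=trimwrite
    runtime=10
    time_based=1

    [job_crypto_ram]
    filename=crypto_ram

    [job_crypto_ram3]
    filename=crypto_ram3
    EOF
    # Preload the plugin so fio can see the spdk_bdev ioengine; the JSON config
    # describes the bdevs (malloc bases plus crypto vbdevs) fio attaches to.
    LD_PRELOAD=$SPDK/build/fio/spdk_bdev fio /tmp/bdev_trim.fio \
        --spdk_json_conf=$SPDK/test/bdev/bdev.json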
00:30:10.103 [2024-05-15 00:12:10.669226] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid560722 ] 00:30:10.363 [2024-05-15 00:12:10.796324] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:10.363 [2024-05-15 00:12:10.899225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:30:10.363 [2024-05-15 00:12:10.899229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:10.623 [2024-05-15 00:12:11.074792] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:10.623 [2024-05-15 00:12:11.074865] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:10.623 [2024-05-15 00:12:11.074880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.623 [2024-05-15 00:12:11.082812] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:10.623 [2024-05-15 00:12:11.082831] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:10.623 [2024-05-15 00:12:11.082843] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.623 [2024-05-15 00:12:11.090835] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:10.623 [2024-05-15 00:12:11.090853] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:10.623 [2024-05-15 00:12:11.090865] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.623 Running I/O for 5 seconds... 
00:30:15.897 00:30:15.897 Latency(us) 00:30:15.897 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:15.897 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:15.897 Verification LBA range: start 0x0 length 0x800 00:30:15.897 crypto_ram : 5.01 6081.85 23.76 0.00 0.00 20960.53 1638.40 28038.01 00:30:15.897 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:15.897 Verification LBA range: start 0x800 length 0x800 00:30:15.897 crypto_ram : 5.02 6119.60 23.90 0.00 0.00 20833.10 1994.57 27924.03 00:30:15.897 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:15.897 Verification LBA range: start 0x0 length 0x800 00:30:15.897 crypto_ram3 : 5.02 3056.84 11.94 0.00 0.00 41643.04 1923.34 30545.47 00:30:15.897 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:15.897 Verification LBA range: start 0x800 length 0x800 00:30:15.897 crypto_ram3 : 5.02 3058.28 11.95 0.00 0.00 41621.37 7864.32 30545.47 00:30:15.897 =================================================================================================================== 00:30:15.897 Total : 18316.57 71.55 0.00 0.00 27827.69 1638.40 30545.47 00:30:15.897 00:30:15.897 real 0m5.838s 00:30:15.897 user 0m10.950s 00:30:15.897 sys 0m0.244s 00:30:15.897 00:12:16 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:15.897 00:12:16 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:30:15.897 ************************************ 00:30:15.897 END TEST bdev_verify 00:30:15.897 ************************************ 00:30:16.156 00:12:16 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:16.156 00:12:16 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:30:16.156 00:12:16 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:16.156 00:12:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:16.156 ************************************ 00:30:16.156 START TEST bdev_verify_big_io 00:30:16.156 ************************************ 00:30:16.156 00:12:16 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:16.156 [2024-05-15 00:12:16.605072] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
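The repeated "Found key ..." and "vbdev creation deferred pending base bdev arrival" notices in these runs come from the crypto_sw stack that bdev.json describes: a malloc base bdev, a software crypto key registered with the accel framework, and a crypto vbdev layered on top. A rough sketch of the equivalent rpc.py sequence follows; sizes and names are taken from the trace, while the key material and exact flag spellings are assumptions that may differ between SPDK versions.

    # Sketch of the stack encoded in bdev.json for the crypto_sw tests.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC=$SPDK/scripts/rpc.py

    $RPC bdev_malloc_create -b Malloc0 16 512            # 32768 blocks of 512 B, as dumped above
    $RPC accel_crypto_key_create -c AES_CBC \
        -k 01234567891234560123456789123456 -n test_dek_sw   # key value is illustrative
    $RPC bdev_crypto_create -n test_dek_sw Malloc0 crypto_ram
    # When the JSON config lists a crypto vbdev before its base bdev, creation is
    # deferred until the base arrives -- which is what the repeated
    # "vbdev creation deferred pending base bdev arrival" notices record.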
00:30:16.156 [2024-05-15 00:12:16.605136] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid561444 ] 00:30:16.156 [2024-05-15 00:12:16.734237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:16.415 [2024-05-15 00:12:16.843596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:30:16.415 [2024-05-15 00:12:16.843602] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:16.674 [2024-05-15 00:12:17.019181] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:16.674 [2024-05-15 00:12:17.019247] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:16.674 [2024-05-15 00:12:17.019262] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:16.674 [2024-05-15 00:12:17.027200] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:16.674 [2024-05-15 00:12:17.027219] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:16.674 [2024-05-15 00:12:17.027231] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:16.674 [2024-05-15 00:12:17.035223] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:16.674 [2024-05-15 00:12:17.035241] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:16.674 [2024-05-15 00:12:17.035252] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:16.674 Running I/O for 5 seconds... 
00:30:21.946 00:30:21.946 Latency(us) 00:30:21.946 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:21.946 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:21.946 Verification LBA range: start 0x0 length 0x80 00:30:21.946 crypto_ram : 5.04 457.28 28.58 0.00 0.00 273276.56 6468.12 377487.36 00:30:21.946 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:21.946 Verification LBA range: start 0x80 length 0x80 00:30:21.946 crypto_ram : 5.05 456.04 28.50 0.00 0.00 273955.05 5841.25 379310.97 00:30:21.946 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:21.946 Verification LBA range: start 0x0 length 0x80 00:30:21.946 crypto_ram3 : 5.27 267.09 16.69 0.00 0.00 450869.34 5584.81 381134.58 00:30:21.946 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:21.946 Verification LBA range: start 0x80 length 0x80 00:30:21.946 crypto_ram3 : 5.22 245.18 15.32 0.00 0.00 490962.39 5812.76 381134.58 00:30:21.946 =================================================================================================================== 00:30:21.946 Total : 1425.60 89.10 0.00 0.00 345953.61 5584.81 381134.58 00:30:22.204 00:30:22.204 real 0m6.078s 00:30:22.204 user 0m11.425s 00:30:22.204 sys 0m0.247s 00:30:22.204 00:12:22 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:22.204 00:12:22 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:30:22.204 ************************************ 00:30:22.204 END TEST bdev_verify_big_io 00:30:22.204 ************************************ 00:30:22.204 00:12:22 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:22.204 00:12:22 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:30:22.204 00:12:22 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:22.204 00:12:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:22.204 ************************************ 00:30:22.204 START TEST bdev_write_zeroes 00:30:22.204 ************************************ 00:30:22.205 00:12:22 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:22.205 [2024-05-15 00:12:22.769545] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:30:22.205 [2024-05-15 00:12:22.769602] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562167 ] 00:30:22.464 [2024-05-15 00:12:22.895504] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:22.464 [2024-05-15 00:12:22.993630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:22.725 [2024-05-15 00:12:23.168572] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:22.725 [2024-05-15 00:12:23.168648] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:22.725 [2024-05-15 00:12:23.168664] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.725 [2024-05-15 00:12:23.176593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:22.725 [2024-05-15 00:12:23.176611] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:22.725 [2024-05-15 00:12:23.176623] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.725 [2024-05-15 00:12:23.184614] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:22.725 [2024-05-15 00:12:23.184631] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:22.725 [2024-05-15 00:12:23.184642] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:22.725 Running I/O for 1 seconds... 00:30:23.669 00:30:23.669 Latency(us) 00:30:23.669 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:23.669 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:23.669 crypto_ram : 1.01 26615.84 103.97 0.00 0.00 4796.84 1289.35 6696.07 00:30:23.669 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:23.669 crypto_ram3 : 1.01 13280.97 51.88 0.00 0.00 9564.86 5955.23 9972.87 00:30:23.669 =================================================================================================================== 00:30:23.669 Total : 39896.81 155.85 0.00 0.00 6386.18 1289.35 9972.87 00:30:23.927 00:30:23.927 real 0m1.769s 00:30:23.927 user 0m1.529s 00:30:23.927 sys 0m0.219s 00:30:23.927 00:12:24 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:23.927 00:12:24 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:30:23.927 ************************************ 00:30:23.927 END TEST bdev_write_zeroes 00:30:23.927 ************************************ 00:30:24.186 00:12:24 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:24.186 00:12:24 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:30:24.186 00:12:24 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:24.186 00:12:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:24.186 ************************************ 00:30:24.186 START TEST bdev_json_nonenclosed 00:30:24.186 
************************************ 00:30:24.186 00:12:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:24.186 [2024-05-15 00:12:24.629250] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:30:24.186 [2024-05-15 00:12:24.629308] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562525 ] 00:30:24.186 [2024-05-15 00:12:24.758725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:24.445 [2024-05-15 00:12:24.856854] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:24.445 [2024-05-15 00:12:24.856923] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:30:24.445 [2024-05-15 00:12:24.856943] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:24.445 [2024-05-15 00:12:24.856955] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:24.445 00:30:24.445 real 0m0.404s 00:30:24.445 user 0m0.248s 00:30:24.445 sys 0m0.153s 00:30:24.445 00:12:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:24.445 00:12:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:30:24.445 ************************************ 00:30:24.445 END TEST bdev_json_nonenclosed 00:30:24.445 ************************************ 00:30:24.445 00:12:25 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:24.445 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:30:24.445 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:24.445 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:24.705 ************************************ 00:30:24.705 START TEST bdev_json_nonarray 00:30:24.705 ************************************ 00:30:24.705 00:12:25 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:24.705 [2024-05-15 00:12:25.126597] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:30:24.705 [2024-05-15 00:12:25.126658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562549 ] 00:30:24.705 [2024-05-15 00:12:25.245149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:24.966 [2024-05-15 00:12:25.347056] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:24.966 [2024-05-15 00:12:25.347131] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
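The two json_config checks here feed deliberately malformed configs to bdevperf and expect a clean error rather than a crash. A valid SPDK JSON config is a single object whose "subsystems" member is an array; the bad variants sketched below are assumptions (not copied from the repo's nonenclosed.json / nonarray.json) that would trip the two errors seen in this output.

    # Valid shape: one enclosing object, "subsystems" is an array.
    cat > /tmp/good.json <<'EOF'
    {
      "subsystems": [
        {
          "subsystem": "bdev",
          "config": [
            { "method": "bdev_malloc_create",
              "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 } }
          ]
        }
      ]
    }
    EOF
    # "not enclosed in {}": top-level braces missing.
    cat > /tmp/nonenclosed.json <<'EOF'
    "subsystems": [ { "subsystem": "bdev", "config": [] } ]
    EOF
    # "'subsystems' should be an array": an object where the array belongs.
    cat > /tmp/nonarray.json <<'EOF'
    { "subsystems": { "subsystem": "bdev", "config": [] } }
    EOF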
00:30:24.966 [2024-05-15 00:12:25.347152] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:24.966 [2024-05-15 00:12:25.347168] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:24.966 00:30:24.966 real 0m0.403s 00:30:24.966 user 0m0.255s 00:30:24.966 sys 0m0.146s 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:30:24.966 ************************************ 00:30:24.966 END TEST bdev_json_nonarray 00:30:24.966 ************************************ 00:30:24.966 00:12:25 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:30:24.966 00:12:25 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:30:24.966 00:12:25 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:30:24.966 00:12:25 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:30:24.966 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:30:24.966 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:24.966 00:12:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:24.966 ************************************ 00:30:24.966 START TEST bdev_crypto_enomem 00:30:24.966 ************************************ 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1121 -- # bdev_crypto_enomem 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=562588 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 562588 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@827 -- # '[' -z 562588 ']' 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:24.966 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:24.966 00:12:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:25.273 [2024-05-15 00:12:25.604410] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:30:25.273 [2024-05-15 00:12:25.604469] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562588 ] 00:30:25.273 [2024-05-15 00:12:25.724282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:25.273 [2024-05-15 00:12:25.821443] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # return 0 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:26.211 true 00:30:26.211 base0 00:30:26.211 true 00:30:26.211 [2024-05-15 00:12:26.498426] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:26.211 crypt0 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@895 -- # local bdev_name=crypt0 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local i 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:26.211 [ 00:30:26.211 { 00:30:26.211 "name": "crypt0", 00:30:26.211 "aliases": [ 00:30:26.211 "9938a412-bd04-51ad-8d31-8e1d37d5ddb3" 00:30:26.211 ], 00:30:26.211 "product_name": "crypto", 00:30:26.211 "block_size": 512, 00:30:26.211 "num_blocks": 2097152, 00:30:26.211 "uuid": "9938a412-bd04-51ad-8d31-8e1d37d5ddb3", 00:30:26.211 "assigned_rate_limits": { 00:30:26.211 "rw_ios_per_sec": 0, 
00:30:26.211 "rw_mbytes_per_sec": 0, 00:30:26.211 "r_mbytes_per_sec": 0, 00:30:26.211 "w_mbytes_per_sec": 0 00:30:26.211 }, 00:30:26.211 "claimed": false, 00:30:26.211 "zoned": false, 00:30:26.211 "supported_io_types": { 00:30:26.211 "read": true, 00:30:26.211 "write": true, 00:30:26.211 "unmap": false, 00:30:26.211 "write_zeroes": true, 00:30:26.211 "flush": false, 00:30:26.211 "reset": true, 00:30:26.211 "compare": false, 00:30:26.211 "compare_and_write": false, 00:30:26.211 "abort": false, 00:30:26.211 "nvme_admin": false, 00:30:26.211 "nvme_io": false 00:30:26.211 }, 00:30:26.211 "memory_domains": [ 00:30:26.211 { 00:30:26.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:26.211 "dma_device_type": 2 00:30:26.211 } 00:30:26.211 ], 00:30:26.211 "driver_specific": { 00:30:26.211 "crypto": { 00:30:26.211 "base_bdev_name": "EE_base0", 00:30:26.211 "name": "crypt0", 00:30:26.211 "key_name": "test_dek_sw" 00:30:26.211 } 00:30:26.211 } 00:30:26.211 } 00:30:26.211 ] 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # return 0 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=562756 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:30:26.211 00:12:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:26.211 Running I/O for 5 seconds... 00:30:27.149 00:12:27 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:30:27.149 00:12:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:27.149 00:12:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:27.149 00:12:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:27.149 00:12:27 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 562756 00:30:31.338 00:30:31.338 Latency(us) 00:30:31.338 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.338 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:30:31.338 crypt0 : 5.00 35936.11 140.38 0.00 0.00 886.29 418.50 1574.29 00:30:31.338 =================================================================================================================== 00:30:31.338 Total : 35936.11 140.38 0.00 0.00 886.29 418.50 1574.29 00:30:31.338 0 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 562588 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@946 -- # '[' -z 562588 ']' 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # kill -0 562588 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@951 -- # uname 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 562588 00:30:31.338 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:30:31.339 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:30:31.339 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@964 -- # echo 'killing process with pid 562588' 00:30:31.339 killing process with pid 562588 00:30:31.339 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@965 -- # kill 562588 00:30:31.339 Received shutdown signal, test time was about 5.000000 seconds 00:30:31.339 00:30:31.339 Latency(us) 00:30:31.339 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.339 =================================================================================================================== 00:30:31.339 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:31.339 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@970 -- # wait 562588 00:30:31.597 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:30:31.597 00:30:31.597 real 0m6.417s 00:30:31.597 user 0m6.606s 00:30:31.597 sys 0m0.372s 00:30:31.597 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:31.597 00:12:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:31.597 ************************************ 00:30:31.597 END TEST bdev_crypto_enomem 00:30:31.597 ************************************ 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:30:31.597 00:12:32 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:30:31.597 00:30:31.597 real 0m54.152s 00:30:31.597 user 1m33.039s 00:30:31.597 sys 0m6.392s 00:30:31.597 00:12:32 blockdev_crypto_sw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:31.597 00:12:32 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:31.597 ************************************ 00:30:31.597 END TEST blockdev_crypto_sw 00:30:31.597 ************************************ 00:30:31.597 00:12:32 -- spdk/autotest.sh@355 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:31.597 00:12:32 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:30:31.597 00:12:32 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:31.597 00:12:32 -- common/autotest_common.sh@10 -- # set +x 00:30:31.597 
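The bdev_crypto_enomem test that just finished stacks crypt0 on an error-injection bdev (EE_base0, per the bdev dump above) and injects ENOMEM completions on writes while bdevperf runs a randwrite workload, checking that the crypto vbdev queues and retries instead of failing I/O. A sketch of the underlying RPC stack follows; names and sizes mirror the trace, key material and flag spellings are assumptions.

    # Sketch of the stack behind bdev_crypto_enomem:
    # malloc base0 -> error bdev EE_base0 -> crypto vbdev crypt0,
    # then ENOMEM injected on writes during the bdevperf run.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    $RPC bdev_malloc_create -b base0 1024 512          # 2097152 blocks of 512 B, as dumped above
    $RPC bdev_error_create base0                       # exposes EE_base0
    $RPC accel_crypto_key_create -c AES_CBC \
        -k 01234567891234560123456789123456 -n test_dek_sw   # key value is illustrative
    $RPC bdev_crypto_create -n test_dek_sw EE_base0 crypt0
    # Inject 5 ENOMEM completions once 31 writes are queued, as in the trace.
    $RPC bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem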
************************************ 00:30:31.597 START TEST blockdev_crypto_qat 00:30:31.597 ************************************ 00:30:31.597 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:31.597 * Looking for test storage... 00:30:31.597 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:31.597 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=563517 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:31.857 00:12:32 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 563517 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@827 -- # '[' -z 563517 ']' 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:31.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:31.857 00:12:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:31.857 [2024-05-15 00:12:32.266693] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:30:31.857 [2024-05-15 00:12:32.266767] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563517 ] 00:30:31.857 [2024-05-15 00:12:32.393939] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:32.116 [2024-05-15 00:12:32.491664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:32.685 00:12:33 blockdev_crypto_qat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:32.685 00:12:33 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # return 0 00:30:32.685 00:12:33 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:32.685 00:12:33 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:30:32.685 00:12:33 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:30:32.685 00:12:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:32.685 00:12:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:32.685 [2024-05-15 00:12:33.201899] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:32.685 [2024-05-15 00:12:33.209931] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:32.685 [2024-05-15 00:12:33.217946] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:32.944 [2024-05-15 00:12:33.283232] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:35.476 true 00:30:35.476 true 00:30:35.476 true 00:30:35.476 true 00:30:35.476 Malloc0 00:30:35.476 Malloc1 00:30:35.476 Malloc2 00:30:35.476 Malloc3 00:30:35.476 [2024-05-15 00:12:35.878620] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:35.476 crypto_ram 00:30:35.476 [2024-05-15 00:12:35.886638] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:35.476 crypto_ram1 00:30:35.476 [2024-05-15 00:12:35.894663] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:35.476 crypto_ram2 00:30:35.476 [2024-05-15 00:12:35.902681] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:35.476 crypto_ram3 00:30:35.476 [ 00:30:35.476 { 00:30:35.476 "name": "Malloc1", 00:30:35.476 "aliases": [ 00:30:35.476 "4c991358-e802-41ad-a202-f437e34ffd65" 00:30:35.476 ], 00:30:35.476 "product_name": "Malloc disk", 00:30:35.476 "block_size": 512, 00:30:35.476 "num_blocks": 65536, 00:30:35.476 "uuid": "4c991358-e802-41ad-a202-f437e34ffd65", 00:30:35.476 "assigned_rate_limits": { 00:30:35.476 
"rw_ios_per_sec": 0, 00:30:35.476 "rw_mbytes_per_sec": 0, 00:30:35.476 "r_mbytes_per_sec": 0, 00:30:35.476 "w_mbytes_per_sec": 0 00:30:35.476 }, 00:30:35.476 "claimed": true, 00:30:35.476 "claim_type": "exclusive_write", 00:30:35.476 "zoned": false, 00:30:35.476 "supported_io_types": { 00:30:35.476 "read": true, 00:30:35.476 "write": true, 00:30:35.476 "unmap": true, 00:30:35.476 "write_zeroes": true, 00:30:35.476 "flush": true, 00:30:35.476 "reset": true, 00:30:35.476 "compare": false, 00:30:35.476 "compare_and_write": false, 00:30:35.476 "abort": true, 00:30:35.476 "nvme_admin": false, 00:30:35.476 "nvme_io": false 00:30:35.476 }, 00:30:35.476 "memory_domains": [ 00:30:35.476 { 00:30:35.476 "dma_device_id": "system", 00:30:35.476 "dma_device_type": 1 00:30:35.476 }, 00:30:35.476 { 00:30:35.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:35.476 "dma_device_type": 2 00:30:35.476 } 00:30:35.476 ], 00:30:35.476 "driver_specific": {} 00:30:35.476 } 00:30:35.476 ] 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.476 00:12:35 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.476 00:12:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:30:35.476 00:12:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.476 00:12:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.476 00:12:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:35.476 00:12:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:35.476 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.476 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:35.476 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:35.476 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:35.476 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:35.476 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f488b798-7c2c-5a75-b25d-ac9dd85e2a85"' ' ],' ' "product_name": "crypto",' ' 
"block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f488b798-7c2c-5a75-b25d-ac9dd85e2a85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "3cadebc4-8e9d-5f49-b71a-f087512c6255"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3cadebc4-8e9d-5f49-b71a-f087512c6255",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a1b3095-0f27-5d9c-9522-c5cea79d3405"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6a1b3095-0f27-5d9c-9522-c5cea79d3405",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7ddb925e-79d0-5e49-ac43-d31d39938c0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7ddb925e-79d0-5e49-ac43-d31d39938c0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' 
"flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:35.735 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 563517 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@946 -- # '[' -z 563517 ']' 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # kill -0 563517 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # uname 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 563517 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 563517' 00:30:35.735 killing process with pid 563517 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@965 -- # kill 563517 00:30:35.735 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@970 -- # wait 563517 00:30:36.303 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:36.303 00:12:36 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:36.303 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:30:36.303 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:36.303 00:12:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:36.303 ************************************ 00:30:36.303 START TEST bdev_hello_world 00:30:36.303 ************************************ 00:30:36.303 00:12:36 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:36.303 [2024-05-15 00:12:36.837976] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:30:36.303 [2024-05-15 00:12:36.838035] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid564132 ] 00:30:36.562 [2024-05-15 00:12:36.964750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.562 [2024-05-15 00:12:37.065954] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.562 [2024-05-15 00:12:37.087245] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:36.562 [2024-05-15 00:12:37.095274] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:36.562 [2024-05-15 00:12:37.103292] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:36.821 [2024-05-15 00:12:37.220000] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:39.353 [2024-05-15 00:12:39.649909] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:39.353 [2024-05-15 00:12:39.649970] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:39.353 [2024-05-15 00:12:39.649986] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:39.353 [2024-05-15 00:12:39.657927] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:39.353 [2024-05-15 00:12:39.657947] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:39.353 [2024-05-15 00:12:39.657958] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:39.353 [2024-05-15 00:12:39.665947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:39.353 [2024-05-15 00:12:39.665965] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:39.353 [2024-05-15 00:12:39.665977] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:39.353 [2024-05-15 00:12:39.673968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:39.353 [2024-05-15 00:12:39.673986] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:39.353 [2024-05-15 00:12:39.673998] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:39.353 [2024-05-15 00:12:39.746787] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:39.353 [2024-05-15 00:12:39.746829] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:39.353 [2024-05-15 00:12:39.746849] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:30:39.353 [2024-05-15 00:12:39.748122] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:39.353 [2024-05-15 00:12:39.748199] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:39.353 [2024-05-15 00:12:39.748216] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:39.353 [2024-05-15 00:12:39.748260] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
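The trace above is the hello_bdev example opening the crypto_ram vbdev, writing a string through the QAT-backed crypto layer and reading it back. As a minimal sketch of how that step could be repeated by hand (the binary path, --json config and -b argument are taken verbatim from the command line in the trace; the bdev.json config is assumed to already define the Malloc base bdevs and the crypto_ram vbdev, as it does in this run):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # Run the hello_bdev example against the pre-generated bdev config,
    # targeting the crypto vbdev by name.
    $SPDK/build/examples/hello_bdev \
        --json $SPDK/test/bdev/bdev.json \
        -b crypto_ram
    # On success the app logs "Writing to the bdev", then
    # "Read string from bdev : Hello World!", and stops itself.
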
00:30:39.353 00:30:39.353 [2024-05-15 00:12:39.748279] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:39.611 00:30:39.611 real 0m3.367s 00:30:39.611 user 0m2.777s 00:30:39.611 sys 0m0.543s 00:30:39.611 00:12:40 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:39.611 00:12:40 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:39.611 ************************************ 00:30:39.611 END TEST bdev_hello_world 00:30:39.611 ************************************ 00:30:39.611 00:12:40 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:39.611 00:12:40 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:30:39.611 00:12:40 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:39.611 00:12:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:39.869 ************************************ 00:30:39.869 START TEST bdev_bounds 00:30:39.869 ************************************ 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=564600 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 564600' 00:30:39.869 Process bdevio pid: 564600 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 564600 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 564600 ']' 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:39.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:39.869 00:12:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:39.869 [2024-05-15 00:12:40.289239] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:30:39.869 [2024-05-15 00:12:40.289300] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid564600 ] 00:30:39.869 [2024-05-15 00:12:40.416130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:40.128 [2024-05-15 00:12:40.523034] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:30:40.128 [2024-05-15 00:12:40.523117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:30:40.128 [2024-05-15 00:12:40.523122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:40.128 [2024-05-15 00:12:40.544508] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:40.128 [2024-05-15 00:12:40.552539] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:40.128 [2024-05-15 00:12:40.560561] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:40.128 [2024-05-15 00:12:40.672700] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:42.661 [2024-05-15 00:12:43.084465] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:42.661 [2024-05-15 00:12:43.084555] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:42.661 [2024-05-15 00:12:43.084570] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:42.661 [2024-05-15 00:12:43.092484] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:42.661 [2024-05-15 00:12:43.092503] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:42.661 [2024-05-15 00:12:43.092514] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:42.661 [2024-05-15 00:12:43.100504] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:42.661 [2024-05-15 00:12:43.100521] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:42.661 [2024-05-15 00:12:43.100533] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:42.661 [2024-05-15 00:12:43.108526] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:42.661 [2024-05-15 00:12:43.108543] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:42.661 [2024-05-15 00:12:43.108555] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:42.661 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:42.661 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:30:42.661 00:12:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:42.922 I/O targets: 00:30:42.922 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:42.922 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:30:42.922 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:30:42.922 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:42.922 00:30:42.922 00:30:42.922 
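The bounds test launches the bdevio app against the same bdev.json and then kicks off the CUnit suites shown below over RPC. A rough sketch of that two-step flow, using only the commands visible in the trace (backgrounding bdevio and killing it afterwards is a simplification of the script's waitforlisten/killprocess handling; -w is presumably wait-for-RPC mode, and the RPC socket defaults to /var/tmp/spdk.sock as set up above):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # Start bdevio exactly as invoked in the trace and keep it in the background.
    $SPDK/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK/test/bdev/bdev.json &
    bdevio_pid=$!
    # Once the app is listening, run every registered suite (its output follows below).
    $SPDK/test/bdev/bdevio/tests.py perform_tests
    kill $bdevio_pid
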
CUnit - A unit testing framework for C - Version 2.1-3 00:30:42.922 http://cunit.sourceforge.net/ 00:30:42.922 00:30:42.922 00:30:42.922 Suite: bdevio tests on: crypto_ram3 00:30:42.922 Test: blockdev write read block ...passed 00:30:42.922 Test: blockdev write zeroes read block ...passed 00:30:42.922 Test: blockdev write zeroes read no split ...passed 00:30:42.922 Test: blockdev write zeroes read split ...passed 00:30:42.922 Test: blockdev write zeroes read split partial ...passed 00:30:42.922 Test: blockdev reset ...passed 00:30:42.922 Test: blockdev write read 8 blocks ...passed 00:30:42.922 Test: blockdev write read size > 128k ...passed 00:30:42.922 Test: blockdev write read invalid size ...passed 00:30:42.922 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:42.922 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:42.922 Test: blockdev write read max offset ...passed 00:30:42.922 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:42.922 Test: blockdev writev readv 8 blocks ...passed 00:30:42.922 Test: blockdev writev readv 30 x 1block ...passed 00:30:42.922 Test: blockdev writev readv block ...passed 00:30:42.922 Test: blockdev writev readv size > 128k ...passed 00:30:42.922 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:42.922 Test: blockdev comparev and writev ...passed 00:30:42.922 Test: blockdev nvme passthru rw ...passed 00:30:42.922 Test: blockdev nvme passthru vendor specific ...passed 00:30:42.922 Test: blockdev nvme admin passthru ...passed 00:30:42.922 Test: blockdev copy ...passed 00:30:42.922 Suite: bdevio tests on: crypto_ram2 00:30:42.922 Test: blockdev write read block ...passed 00:30:42.922 Test: blockdev write zeroes read block ...passed 00:30:42.922 Test: blockdev write zeroes read no split ...passed 00:30:42.922 Test: blockdev write zeroes read split ...passed 00:30:42.922 Test: blockdev write zeroes read split partial ...passed 00:30:42.922 Test: blockdev reset ...passed 00:30:42.922 Test: blockdev write read 8 blocks ...passed 00:30:42.922 Test: blockdev write read size > 128k ...passed 00:30:42.922 Test: blockdev write read invalid size ...passed 00:30:42.922 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:42.922 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:42.922 Test: blockdev write read max offset ...passed 00:30:42.922 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:42.922 Test: blockdev writev readv 8 blocks ...passed 00:30:42.922 Test: blockdev writev readv 30 x 1block ...passed 00:30:42.922 Test: blockdev writev readv block ...passed 00:30:42.922 Test: blockdev writev readv size > 128k ...passed 00:30:42.922 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:42.922 Test: blockdev comparev and writev ...passed 00:30:42.922 Test: blockdev nvme passthru rw ...passed 00:30:42.922 Test: blockdev nvme passthru vendor specific ...passed 00:30:42.922 Test: blockdev nvme admin passthru ...passed 00:30:42.922 Test: blockdev copy ...passed 00:30:42.922 Suite: bdevio tests on: crypto_ram1 00:30:42.922 Test: blockdev write read block ...passed 00:30:42.922 Test: blockdev write zeroes read block ...passed 00:30:42.922 Test: blockdev write zeroes read no split ...passed 00:30:42.922 Test: blockdev write zeroes read split ...passed 00:30:42.922 Test: blockdev write zeroes read split partial ...passed 00:30:42.922 Test: blockdev reset ...passed 00:30:42.922 
Test: blockdev write read 8 blocks ...passed 00:30:42.922 Test: blockdev write read size > 128k ...passed 00:30:42.922 Test: blockdev write read invalid size ...passed 00:30:42.922 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:42.922 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:42.922 Test: blockdev write read max offset ...passed 00:30:42.922 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:42.922 Test: blockdev writev readv 8 blocks ...passed 00:30:42.922 Test: blockdev writev readv 30 x 1block ...passed 00:30:42.922 Test: blockdev writev readv block ...passed 00:30:42.922 Test: blockdev writev readv size > 128k ...passed 00:30:42.922 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:42.922 Test: blockdev comparev and writev ...passed 00:30:42.922 Test: blockdev nvme passthru rw ...passed 00:30:42.922 Test: blockdev nvme passthru vendor specific ...passed 00:30:42.922 Test: blockdev nvme admin passthru ...passed 00:30:42.922 Test: blockdev copy ...passed 00:30:42.922 Suite: bdevio tests on: crypto_ram 00:30:42.922 Test: blockdev write read block ...passed 00:30:42.922 Test: blockdev write zeroes read block ...passed 00:30:42.922 Test: blockdev write zeroes read no split ...passed 00:30:43.180 Test: blockdev write zeroes read split ...passed 00:30:43.181 Test: blockdev write zeroes read split partial ...passed 00:30:43.181 Test: blockdev reset ...passed 00:30:43.181 Test: blockdev write read 8 blocks ...passed 00:30:43.181 Test: blockdev write read size > 128k ...passed 00:30:43.181 Test: blockdev write read invalid size ...passed 00:30:43.181 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:43.181 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:43.181 Test: blockdev write read max offset ...passed 00:30:43.181 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:43.181 Test: blockdev writev readv 8 blocks ...passed 00:30:43.181 Test: blockdev writev readv 30 x 1block ...passed 00:30:43.181 Test: blockdev writev readv block ...passed 00:30:43.181 Test: blockdev writev readv size > 128k ...passed 00:30:43.181 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:43.181 Test: blockdev comparev and writev ...passed 00:30:43.181 Test: blockdev nvme passthru rw ...passed 00:30:43.181 Test: blockdev nvme passthru vendor specific ...passed 00:30:43.181 Test: blockdev nvme admin passthru ...passed 00:30:43.181 Test: blockdev copy ...passed 00:30:43.181 00:30:43.181 Run Summary: Type Total Ran Passed Failed Inactive 00:30:43.181 suites 4 4 n/a 0 0 00:30:43.181 tests 92 92 92 0 0 00:30:43.181 asserts 520 520 520 0 n/a 00:30:43.181 00:30:43.181 Elapsed time = 0.528 seconds 00:30:43.181 0 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 564600 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 564600 ']' 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 564600 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 564600 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 564600' 00:30:43.181 killing process with pid 564600 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@965 -- # kill 564600 00:30:43.181 00:12:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@970 -- # wait 564600 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:43.747 00:30:43.747 real 0m3.826s 00:30:43.747 user 0m10.526s 00:30:43.747 sys 0m0.732s 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:43.747 ************************************ 00:30:43.747 END TEST bdev_bounds 00:30:43.747 ************************************ 00:30:43.747 00:12:44 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:43.747 00:12:44 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:30:43.747 00:12:44 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:43.747 00:12:44 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:43.747 ************************************ 00:30:43.747 START TEST bdev_nbd 00:30:43.747 ************************************ 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:43.747 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=565154 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 565154 /var/tmp/spdk-nbd.sock 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 565154 ']' 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:43.748 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:43.748 00:12:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:43.748 [2024-05-15 00:12:44.203552] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:30:43.748 [2024-05-15 00:12:44.203613] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:43.748 [2024-05-15 00:12:44.330349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:44.008 [2024-05-15 00:12:44.439592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:44.008 [2024-05-15 00:12:44.460883] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:44.008 [2024-05-15 00:12:44.468903] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:44.008 [2024-05-15 00:12:44.476921] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:44.008 [2024-05-15 00:12:44.589123] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:46.574 [2024-05-15 00:12:47.015723] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:46.574 [2024-05-15 00:12:47.015796] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:46.574 [2024-05-15 00:12:47.015812] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:46.574 [2024-05-15 00:12:47.023742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:46.574 [2024-05-15 00:12:47.023761] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:46.574 [2024-05-15 00:12:47.023773] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:46.574 [2024-05-15 00:12:47.031762] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:46.574 [2024-05-15 00:12:47.031784] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:46.574 [2024-05-15 00:12:47.031796] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:46.574 [2024-05-15 00:12:47.039783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:46.574 [2024-05-15 00:12:47.039799] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:46.574 [2024-05-15 00:12:47.039810] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 
crypto_ram3' 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:46.574 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:46.834 1+0 records in 00:30:46.834 1+0 records out 00:30:46.834 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325653 s, 12.6 MB/s 00:30:46.834 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:30:47.093 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:47.352 00:12:47 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:47.352 1+0 records in 00:30:47.352 1+0 records out 00:30:47.352 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244315 s, 16.8 MB/s 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:47.352 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:47.611 1+0 records in 00:30:47.611 1+0 records out 00:30:47.611 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315698 s, 13.0 MB/s 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:47.611 00:12:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:47.870 1+0 records in 00:30:47.870 1+0 records out 00:30:47.870 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286539 s, 14.3 MB/s 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:47.870 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd0", 00:30:48.129 "bdev_name": "crypto_ram" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd1", 00:30:48.129 "bdev_name": "crypto_ram1" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd2", 00:30:48.129 "bdev_name": "crypto_ram2" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd3", 00:30:48.129 "bdev_name": "crypto_ram3" 00:30:48.129 } 00:30:48.129 ]' 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd0", 00:30:48.129 "bdev_name": "crypto_ram" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd1", 00:30:48.129 "bdev_name": "crypto_ram1" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd2", 00:30:48.129 "bdev_name": "crypto_ram2" 00:30:48.129 }, 00:30:48.129 { 00:30:48.129 "nbd_device": "/dev/nbd3", 00:30:48.129 "bdev_name": "crypto_ram3" 00:30:48.129 } 00:30:48.129 ]' 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:48.129 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:48.388 00:12:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:48.647 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:48.906 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:49.166 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:49.425 00:12:49 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:49.425 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:49.426 00:12:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:49.685 /dev/nbd0 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 
00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:49.685 1+0 records in 00:30:49.685 1+0 records out 00:30:49.685 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302252 s, 13.6 MB/s 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:49.685 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:30:49.944 /dev/nbd1 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:49.944 1+0 records in 00:30:49.944 1+0 records out 00:30:49.944 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307038 s, 13.3 MB/s 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:49.944 
00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:49.944 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:30:50.203 /dev/nbd10 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:50.203 1+0 records in 00:30:50.203 1+0 records out 00:30:50.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298254 s, 13.7 MB/s 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:50.203 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:30:50.462 /dev/nbd11 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:50.462 1+0 records in 00:30:50.462 1+0 records out 00:30:50.462 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000409425 s, 10.0 MB/s 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:50.462 00:12:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd0", 00:30:50.722 "bdev_name": "crypto_ram" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd1", 00:30:50.722 "bdev_name": "crypto_ram1" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd10", 00:30:50.722 "bdev_name": "crypto_ram2" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd11", 00:30:50.722 "bdev_name": "crypto_ram3" 00:30:50.722 } 00:30:50.722 ]' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd0", 00:30:50.722 "bdev_name": "crypto_ram" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd1", 00:30:50.722 "bdev_name": "crypto_ram1" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd10", 00:30:50.722 "bdev_name": "crypto_ram2" 00:30:50.722 }, 00:30:50.722 { 00:30:50.722 "nbd_device": "/dev/nbd11", 00:30:50.722 "bdev_name": "crypto_ram3" 00:30:50.722 } 00:30:50.722 ]' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] 
| .nbd_device' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:50.722 /dev/nbd1 00:30:50.722 /dev/nbd10 00:30:50.722 /dev/nbd11' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:50.722 /dev/nbd1 00:30:50.722 /dev/nbd10 00:30:50.722 /dev/nbd11' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:30:50.722 256+0 records in 00:30:50.722 256+0 records out 00:30:50.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104649 s, 100 MB/s 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:50.722 256+0 records in 00:30:50.722 256+0 records out 00:30:50.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0840314 s, 12.5 MB/s 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:50.722 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:50.981 256+0 records in 00:30:50.981 256+0 records out 00:30:50.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.063162 s, 16.6 MB/s 00:30:50.981 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:50.981 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:50.981 256+0 records in 00:30:50.981 256+0 records out 00:30:50.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0585226 s, 17.9 MB/s 00:30:50.981 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:50.981 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 
oflag=direct 00:30:50.981 256+0 records in 00:30:50.981 256+0 records out 00:30:50.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0557677 s, 18.8 MB/s 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:50.982 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:51.241 00:12:51 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:51.241 00:12:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:51.500 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:51.758 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:51.759 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:51.759 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:30:52.018 
00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:52.018 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:52.278 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:52.537 00:12:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:52.796 malloc_lvol_verify 00:30:52.796 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:53.059 0bb9bb3d-7895-4c0f-bd0a-767054098700 00:30:53.059 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:53.059 adcb2e31-b234-46aa-9481-30420012f427 00:30:53.318 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:53.318 /dev/nbd0 00:30:53.318 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:30:53.318 mke2fs 1.46.5 (30-Dec-2021) 00:30:53.318 Discarding device blocks: 0/4096 done 00:30:53.318 
Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:53.318 00:30:53.318 Allocating group tables: 0/1 done 00:30:53.318 Writing inode tables: 0/1 done 00:30:53.318 Creating journal (1024 blocks): done 00:30:53.577 Writing superblocks and filesystem accounting information: 0/1 done 00:30:53.577 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:53.577 00:12:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 565154 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 565154 ']' 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 565154 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 565154 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 565154' 00:30:53.837 killing process with pid 565154 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@965 -- # kill 565154 00:30:53.837 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@970 -- # wait 565154 00:30:54.406 00:12:54 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 
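The bdev_nbd pass that finishes here reduces to a small dd/cmp round-trip against each exported NBD device, followed by an RPC-driven teardown that polls /proc/partitions until the kernel drops the device. A minimal sketch of that flow, assuming illustrative names (RPC, NBD_LIST, TMP_FILE) and a polling interval that are not the exact helpers from bdev/nbd_common.sh:

    RPC="scripts/rpc.py -s /var/tmp/spdk-nbd.sock"        # RPC socket served by the nbd app
    NBD_LIST=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)
    TMP_FILE=$(mktemp)

    # write pass: seed a random reference file, then copy it onto every device
    dd if=/dev/urandom of="$TMP_FILE" bs=4096 count=256
    for nbd in "${NBD_LIST[@]}"; do
        dd if="$TMP_FILE" of="$nbd" bs=4096 count=256 oflag=direct
    done

    # verify pass: compare the first 1M of each device against the reference file
    for nbd in "${NBD_LIST[@]}"; do
        cmp -b -n 1M "$TMP_FILE" "$nbd"
    done
    rm -f "$TMP_FILE"

    # teardown: stop each disk over RPC, then wait for it to leave /proc/partitions
    for nbd in "${NBD_LIST[@]}"; do
        $RPC nbd_stop_disk "$nbd"
        for i in $(seq 1 20); do
            grep -q -w "$(basename "$nbd")" /proc/partitions || break
            sleep 0.1
        done
    done

Once nbd_get_disks returns an empty list, the test tears down the nbd target and moves on to the fio suite below.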
00:30:54.406 00:30:54.406 real 0m10.545s 00:30:54.406 user 0m13.531s 00:30:54.406 sys 0m4.252s 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:54.407 ************************************ 00:30:54.407 END TEST bdev_nbd 00:30:54.407 ************************************ 00:30:54.407 00:12:54 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:54.407 00:12:54 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:30:54.407 00:12:54 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:30:54.407 00:12:54 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:54.407 00:12:54 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:30:54.407 00:12:54 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:54.407 00:12:54 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:54.407 ************************************ 00:30:54.407 START TEST bdev_fio 00:30:54.407 ************************************ 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:54.407 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # cat 
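The fio stage starting here is driven by two generated pieces: fio_config_gen writes the [global] section of test/bdev/bdev.fio for a verify workload (the serialize_overlap=1 line echoed just below is appended because the detected fio is a 3.x release), and fio_test_suite then appends one [job_*] section per crypto bdev before launching fio with the SPDK bdev ioengine preloaded from build/fio/spdk_bdev. A rough sketch under those assumptions, with the [global] contents approximated rather than copied from autotest_common.sh:

    # approximate [global] contents; the authoritative version is produced by
    # fio_config_gen in common/autotest_common.sh
    cat > bdev.fio <<'EOF'
    [global]
    ioengine=spdk_bdev
    thread=1
    direct=1
    rw=randwrite
    verify=crc32c
    serialize_overlap=1
    EOF

    # one job per crypto bdev, matching the echo '[job_...]' lines logged below
    for b in crypto_ram crypto_ram1 crypto_ram2 crypto_ram3; do
        printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> bdev.fio
    done

    # run fio with the SPDK bdev engine; bdev.json describes the QAT crypto bdevs
    LD_PRELOAD=build/fio/spdk_bdev /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        bdev.fio --verify_state_save=0 --spdk_json_conf=test/bdev/bdev.json

The same job file is regenerated later with workload=trim for the bdev_fio_trim sub-test, which is why the rw= line switches to trimwrite in that run's output.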
00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:54.407 ************************************ 00:30:54.407 START TEST bdev_fio_rw_verify 00:30:54.407 ************************************ 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:30:54.407 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:54.665 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:54.665 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:54.665 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:54.665 00:12:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:54.923 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:54.923 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:54.923 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:54.923 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:54.923 fio-3.35 00:30:54.923 Starting 4 threads 00:31:09.808 00:31:09.808 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=567211: Wed May 15 00:13:08 2024 00:31:09.808 read: IOPS=18.7k, BW=73.1MiB/s (76.6MB/s)(731MiB/10001msec) 00:31:09.808 slat (usec): min=17, max=522, avg=73.62, stdev=31.27 00:31:09.808 clat (usec): min=22, max=1680, avg=398.72, stdev=219.11 00:31:09.808 lat (usec): min=65, max=1870, avg=472.34, stdev=229.64 00:31:09.808 clat percentiles (usec): 00:31:09.808 | 50.000th=[ 359], 99.000th=[ 979], 99.900th=[ 1172], 99.990th=[ 1319], 00:31:09.808 | 99.999th=[ 1663] 00:31:09.808 write: IOPS=20.8k, BW=81.1MiB/s (85.1MB/s)(793MiB/9773msec); 0 zone resets 00:31:09.808 slat (usec): min=24, max=1495, avg=86.88, stdev=30.46 00:31:09.808 clat (usec): min=28, max=2419, avg=448.54, stdev=239.96 00:31:09.808 lat (usec): min=71, max=2493, avg=535.42, stdev=249.28 00:31:09.808 clat percentiles (usec): 00:31:09.808 | 50.000th=[ 412], 99.000th=[ 1074], 99.900th=[ 1270], 99.990th=[ 1467], 00:31:09.808 | 99.999th=[ 2073] 00:31:09.808 bw ( KiB/s): min=59456, max=115160, per=97.85%, avg=81268.68, stdev=4037.31, samples=76 00:31:09.808 iops : min=14864, max=28790, avg=20317.16, stdev=1009.33, samples=76 00:31:09.808 lat (usec) : 50=0.01%, 100=2.30%, 250=25.25%, 500=38.18%, 750=24.01% 00:31:09.808 lat (usec) : 1000=8.87% 00:31:09.808 lat (msec) : 2=1.38%, 4=0.01% 00:31:09.808 cpu : usr=99.59%, sys=0.02%, ctx=57, majf=0, minf=302 00:31:09.808 IO depths : 1=7.9%, 2=26.3%, 4=52.6%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:09.808 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:09.808 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:09.808 issued rwts: total=187063,202930,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:09.808 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:09.808 00:31:09.808 Run status group 0 (all jobs): 00:31:09.808 READ: bw=73.1MiB/s (76.6MB/s), 73.1MiB/s-73.1MiB/s (76.6MB/s-76.6MB/s), io=731MiB (766MB), run=10001-10001msec 00:31:09.808 WRITE: bw=81.1MiB/s (85.1MB/s), 81.1MiB/s-81.1MiB/s (85.1MB/s-85.1MB/s), io=793MiB (831MB), run=9773-9773msec 00:31:09.808 00:31:09.808 real 0m13.798s 00:31:09.808 user 0m45.951s 00:31:09.808 sys 0m0.698s 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:09.808 ************************************ 00:31:09.808 END TEST bdev_fio_rw_verify 00:31:09.808 ************************************ 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f488b798-7c2c-5a75-b25d-ac9dd85e2a85"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f488b798-7c2c-5a75-b25d-ac9dd85e2a85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "3cadebc4-8e9d-5f49-b71a-f087512c6255"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3cadebc4-8e9d-5f49-b71a-f087512c6255",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a1b3095-0f27-5d9c-9522-c5cea79d3405"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6a1b3095-0f27-5d9c-9522-c5cea79d3405",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7ddb925e-79d0-5e49-ac43-d31d39938c0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7ddb925e-79d0-5e49-ac43-d31d39938c0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:09.808 crypto_ram1 00:31:09.808 crypto_ram2 00:31:09.808 crypto_ram3 ]] 00:31:09.808 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f488b798-7c2c-5a75-b25d-ac9dd85e2a85"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f488b798-7c2c-5a75-b25d-ac9dd85e2a85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "3cadebc4-8e9d-5f49-b71a-f087512c6255"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3cadebc4-8e9d-5f49-b71a-f087512c6255",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a1b3095-0f27-5d9c-9522-c5cea79d3405"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6a1b3095-0f27-5d9c-9522-c5cea79d3405",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "7ddb925e-79d0-5e49-ac43-d31d39938c0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "7ddb925e-79d0-5e49-ac43-d31d39938c0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:09.809 ************************************ 00:31:09.809 START TEST bdev_fio_trim 00:31:09.809 ************************************ 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:09.809 00:13:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:09.809 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:09.809 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:09.809 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:09.809 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:09.809 fio-3.35 00:31:09.809 Starting 4 threads 00:31:22.065 00:31:22.065 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=569072: Wed May 15 00:13:22 2024 00:31:22.065 write: IOPS=32.9k, BW=128MiB/s (135MB/s)(1284MiB/10001msec); 0 zone resets 00:31:22.065 slat (usec): min=17, max=1647, avg=70.03, stdev=24.58 00:31:22.065 clat (usec): min=40, max=1196, avg=251.64, stdev=123.01 00:31:22.066 lat (usec): min=59, max=2212, avg=321.68, stdev=132.46 00:31:22.066 clat percentiles (usec): 00:31:22.066 | 50.000th=[ 243], 99.000th=[ 611], 99.900th=[ 734], 99.990th=[ 799], 00:31:22.066 | 99.999th=[ 1156] 00:31:22.066 bw ( KiB/s): min=109056, max=187872, per=100.00%, avg=132166.47, stdev=5166.84, samples=76 00:31:22.066 iops : min=27264, max=46968, avg=33041.58, stdev=1291.71, samples=76 00:31:22.066 trim: IOPS=32.9k, BW=128MiB/s (135MB/s)(1284MiB/10001msec); 0 zone resets 00:31:22.066 slat (usec): min=4, max=429, avg=21.05, stdev= 9.64 00:31:22.066 clat (usec): min=6, max=2212, avg=321.86, stdev=132.48 00:31:22.066 lat (usec): min=23, max=2233, avg=342.91, stdev=135.98 00:31:22.066 clat percentiles (usec): 00:31:22.066 | 50.000th=[ 314], 99.000th=[ 701], 99.900th=[ 832], 99.990th=[ 938], 00:31:22.066 | 99.999th=[ 1385] 00:31:22.066 bw ( KiB/s): min=109056, max=187872, per=100.00%, avg=132166.47, stdev=5166.84, samples=76 00:31:22.066 iops : min=27264, max=46968, avg=33041.58, stdev=1291.71, samples=76 00:31:22.066 lat (usec) : 10=0.01%, 50=0.04%, 100=5.04%, 250=38.27%, 500=50.59% 00:31:22.066 lat (usec) : 750=5.75%, 1000=0.30% 00:31:22.066 lat (msec) : 2=0.01%, 4=0.01% 00:31:22.066 cpu : usr=99.61%, sys=0.00%, ctx=50, majf=0, minf=99 00:31:22.066 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:22.066 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:22.066 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:22.066 issued rwts: total=0,328779,328780,0 short=0,0,0,0 dropped=0,0,0,0 00:31:22.066 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:22.066 00:31:22.066 Run status group 0 (all jobs): 00:31:22.066 WRITE: bw=128MiB/s (135MB/s), 128MiB/s-128MiB/s (135MB/s-135MB/s), io=1284MiB (1347MB), run=10001-10001msec 00:31:22.066 TRIM: bw=128MiB/s (135MB/s), 128MiB/s-128MiB/s (135MB/s-135MB/s), io=1284MiB (1347MB), run=10001-10001msec 00:31:22.325 00:31:22.325 real 0m13.769s 00:31:22.325 user 0m45.879s 00:31:22.325 sys 0m0.662s 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:31:22.325 ************************************ 00:31:22.325 END TEST bdev_fio_trim 00:31:22.325 ************************************ 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:31:22.325 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:31:22.325 00:31:22.325 real 0m27.935s 00:31:22.325 user 
1m32.010s 00:31:22.325 sys 0m1.562s 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:22.325 ************************************ 00:31:22.325 END TEST bdev_fio 00:31:22.325 ************************************ 00:31:22.325 00:13:22 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:22.325 00:13:22 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:22.325 00:13:22 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:31:22.325 00:13:22 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:22.325 00:13:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:22.325 ************************************ 00:31:22.325 START TEST bdev_verify 00:31:22.325 ************************************ 00:31:22.325 00:13:22 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:22.325 [2024-05-15 00:13:22.867035] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:22.325 [2024-05-15 00:13:22.867098] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid570497 ] 00:31:22.584 [2024-05-15 00:13:22.996606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:22.584 [2024-05-15 00:13:23.095562] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:31:22.584 [2024-05-15 00:13:23.095569] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:22.584 [2024-05-15 00:13:23.117059] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:22.584 [2024-05-15 00:13:23.125085] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:22.584 [2024-05-15 00:13:23.133102] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:22.842 [2024-05-15 00:13:23.236576] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:25.376 [2024-05-15 00:13:25.650490] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:25.376 [2024-05-15 00:13:25.650573] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:25.376 [2024-05-15 00:13:25.650588] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.376 [2024-05-15 00:13:25.658511] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:25.376 [2024-05-15 00:13:25.658529] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:25.376 [2024-05-15 00:13:25.658542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.376 [2024-05-15 
00:13:25.666532] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:25.376 [2024-05-15 00:13:25.666553] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:25.376 [2024-05-15 00:13:25.666565] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.376 [2024-05-15 00:13:25.674558] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:25.376 [2024-05-15 00:13:25.674575] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:25.376 [2024-05-15 00:13:25.674587] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:25.376 Running I/O for 5 seconds... 00:31:30.643 00:31:30.643 Latency(us) 00:31:30.643 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:30.643 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x0 length 0x1000 00:31:30.643 crypto_ram : 5.07 483.12 1.89 0.00 0.00 263584.79 1446.07 187831.87 00:31:30.643 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x1000 length 0x1000 00:31:30.643 crypto_ram : 5.08 491.77 1.92 0.00 0.00 258783.97 3248.31 186920.07 00:31:30.643 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x0 length 0x1000 00:31:30.643 crypto_ram1 : 5.07 486.08 1.90 0.00 0.00 261246.31 719.47 169595.77 00:31:30.643 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x1000 length 0x1000 00:31:30.643 crypto_ram1 : 5.08 494.74 1.93 0.00 0.00 256543.96 3177.07 168683.97 00:31:30.643 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x0 length 0x1000 00:31:30.643 crypto_ram2 : 5.06 3810.13 14.88 0.00 0.00 33231.06 2578.70 27582.11 00:31:30.643 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x1000 length 0x1000 00:31:30.643 crypto_ram2 : 5.05 3849.57 15.04 0.00 0.00 32903.98 7066.49 27354.16 00:31:30.643 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x0 length 0x1000 00:31:30.643 crypto_ram3 : 5.06 3819.07 14.92 0.00 0.00 33081.32 1866.35 27240.18 00:31:30.643 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:30.643 Verification LBA range: start 0x1000 length 0x1000 00:31:30.643 crypto_ram3 : 5.07 3864.22 15.09 0.00 0.00 32691.71 3675.71 26784.28 00:31:30.643 =================================================================================================================== 00:31:30.643 Total : 17298.71 67.57 0.00 0.00 58700.00 719.47 187831.87 00:31:30.902 00:31:30.902 real 0m8.466s 00:31:30.902 user 0m15.901s 00:31:30.902 sys 0m0.531s 00:31:30.902 00:13:31 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:30.902 00:13:31 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:30.902 ************************************ 00:31:30.902 END TEST bdev_verify 00:31:30.902 ************************************ 00:31:30.902 00:13:31 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:30.902 00:13:31 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:31:30.902 00:13:31 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:30.902 00:13:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:30.902 ************************************ 00:31:30.902 START TEST bdev_verify_big_io 00:31:30.902 ************************************ 00:31:30.902 00:13:31 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:30.902 [2024-05-15 00:13:31.420104] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:30.902 [2024-05-15 00:13:31.420161] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid571565 ] 00:31:31.160 [2024-05-15 00:13:31.547665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:31.160 [2024-05-15 00:13:31.646012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:31:31.160 [2024-05-15 00:13:31.646018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:31.160 [2024-05-15 00:13:31.667524] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:31.160 [2024-05-15 00:13:31.675547] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:31.160 [2024-05-15 00:13:31.683567] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:31.418 [2024-05-15 00:13:31.796416] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:33.948 [2024-05-15 00:13:34.216241] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:33.948 [2024-05-15 00:13:34.216315] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:33.948 [2024-05-15 00:13:34.216330] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.948 [2024-05-15 00:13:34.224256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:33.948 [2024-05-15 00:13:34.224275] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:33.948 [2024-05-15 00:13:34.224287] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.948 [2024-05-15 00:13:34.232277] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:33.948 [2024-05-15 00:13:34.232295] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:33.948 [2024-05-15 00:13:34.232307] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.948 [2024-05-15 00:13:34.240302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:33.948 [2024-05-15 00:13:34.240323] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:33.948 [2024-05-15 00:13:34.240334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:33.948 Running I/O for 5 seconds...
00:31:34.886 [2024-05-15 00:13:35.181650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:34.886-00:31:34.891 [the identical accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! record repeats for every remaining entry in this run (app timestamps 00:13:35.182081 through 00:13:35.367554); duplicate entries collapsed]
00:31:34.891 [2024-05-15 00:13:35.367570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.367584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.371860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.373569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.375215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.376713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.378416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.380057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.381701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.382944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.383350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.383367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.383382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.383402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.387370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.388984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.390612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.391341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.393009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.394638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.396255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.396658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.397104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.397121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:34.891 [2024-05-15 00:13:35.397136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.397156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.401226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.402872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.404328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.405720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.407715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.409358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.410411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.410807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.411249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.411269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.411284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.411299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.415164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.416822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.417563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.418949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.421045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.422776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.423167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.423564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.423936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.423953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:34.891 [2024-05-15 00:13:35.423968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.423983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.427686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.428758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.430520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.432093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.434028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.891 [2024-05-15 00:13:35.434741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.435130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.435524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.436001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.436019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.436034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.436049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.439664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.440662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.442035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.443673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.445446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.445835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.446222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.446620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.447070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.447087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:34.892 [2024-05-15 00:13:35.447105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.447121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.449683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.451322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.453132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.454821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.455530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.455925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.456312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.456705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.457118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.457135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.457149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.457163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.460183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.461557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.463203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.464845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.465665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.466055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.466462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.466854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.467129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.467146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:34.892 [2024-05-15 00:13:35.467161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.467175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.470276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:34.892 [2024-05-15 00:13:35.471929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.473577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.474630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.475512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.475902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.476291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.477732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.478060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.478077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.478091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.478106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.481570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.483350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.485014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.485411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.486223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.155 [2024-05-15 00:13:35.486620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.487392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.488792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.489064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.489081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.156 [2024-05-15 00:13:35.489095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.489110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.492438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.494087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.494704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.495091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.495930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.496321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.498154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.499852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.500137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.500154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.500169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.500183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.503529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.503929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.504322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.504718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.505723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.507126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.508778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.510428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.510699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.510716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.156 [2024-05-15 00:13:35.510731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.510746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.513382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.513800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.514190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.514583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.516595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.518045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.519682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.521318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.521710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.521727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.521742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.521756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.523747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.524140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.524539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.524929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.526619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.528265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.529936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.530726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.531003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.531020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.156 [2024-05-15 00:13:35.531034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.531049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.533165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.533564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.533953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.535084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.537086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.538753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.540035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.541593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.541909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.541927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.541942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.541957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.544242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.544642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.545265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.546639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.548588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.550422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.551448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.552827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.553102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.553123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.156 [2024-05-15 00:13:35.553137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.553152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.555595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.555990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.557821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.559519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.561461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.562190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.563584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.565223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.565499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.156 [2024-05-15 00:13:35.565516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.565531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.565546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.568159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.569441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.570821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.572478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.573924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.575620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.577149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.578810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.579089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.579106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.157 [2024-05-15 00:13:35.579121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.579136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.582379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.583775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.585434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.587093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.588731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.590113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.591754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.593404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.593766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.593783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.593797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.593812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.598111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.599901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.601574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.603121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.604824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.606485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.608131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.609204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.609646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.609664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.157 [2024-05-15 00:13:35.609679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.609694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.613301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.614933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.616575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.617366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.619460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.621238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.622903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.623292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.623720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.623738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.623759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.623774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.627452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.629095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.630088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.631903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.633828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.635471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.636112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.636506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.636950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.636967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.157 [2024-05-15 00:13:35.636982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.636996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.640756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.642444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.643604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.644985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.646922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.648209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.648609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.649001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.649391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.649413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.649427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.649443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.652909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.653845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.655577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.657346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.659286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.659768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.660162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.660558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.661004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.661022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.157 [2024-05-15 00:13:35.661038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.661053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.663347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.664726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.666298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.667682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.668441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.668832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.669221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.157 [2024-05-15 00:13:35.669617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.669995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.670012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.670028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.670042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.672926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.673324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.673722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.674113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.674978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.675392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.675794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.676189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.676626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.158 [2024-05-15 00:13:35.676643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.158 [2024-05-15 00:13:35.676657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeated continuously between 00:13:35.676657 and 00:13:35.861426; duplicate entries omitted ...]
00:31:35.431 [2024-05-15 00:13:35.861426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:35.431 [2024-05-15 00:13:35.863162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.863552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.863570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.863585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.863600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.867127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.868771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.870423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.871571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.871844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.873232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.874883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.876538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.877219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.877708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.877725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.877740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.877760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.881421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.883103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.884872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.885842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.886152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.887821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.431 [2024-05-15 00:13:35.889456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.890930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.891319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.891762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.891779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.891795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.891812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.895405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.897059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.897955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.899648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.899919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.901584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.903229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.431 [2024-05-15 00:13:35.903689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.904084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.904492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.904509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.904524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.904539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.908086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.909665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.910949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.912353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.432 [2024-05-15 00:13:35.912637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.914309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.915331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.915737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.916126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.916612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.916630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.916646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.916661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.919965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.920718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.922097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.923728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.924010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.925849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.926245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.926637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.927025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.927465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.927483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.927498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.927514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.930349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.931880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.432 [2024-05-15 00:13:35.933261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.934891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.935161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.936104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.936504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.936893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.937286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.937718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.937735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.937749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.937763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.940015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.941396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.943045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.944691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.944964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.945370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.945766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.946865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.432 [2024-05-15 00:13:35.950220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.951864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.953608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.955457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.955846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.956249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.956640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.957029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.958147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.958465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.958481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.958496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.958515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.961469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.963114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.964762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.965499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.966005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.966411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.966802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.432 [2024-05-15 00:13:35.967314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.968751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.969022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.969039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.433 [2024-05-15 00:13:35.969053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.969068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.972364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.974010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.975425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.975810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.976243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.976648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.977037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.978512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.979897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.980167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.980183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.980197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.980212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.983493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.985133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.985539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.985932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.986318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.986726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.987613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.989006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.990666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.433 [2024-05-15 00:13:35.990938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.990957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.990972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.990986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.994264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.995151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.995554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.995943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.996435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.996836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:35.998419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.000136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.001775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.002049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.002065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.002079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.002094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.005232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.005636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.006026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.006422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.006857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.008299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.433 [2024-05-15 00:13:36.009677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.433 [2024-05-15 00:13:36.011307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.012947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.013425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.013443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.013457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.013472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.015467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.015861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.016250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.016644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.016993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.018389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.020211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.021874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.022608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.022880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.022898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.022912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.022927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.024948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.025346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.025741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.027105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.027420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.695 [2024-05-15 00:13:36.029095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.695 [2024-05-15 00:13:36.030728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.031740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.033588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.033863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.033880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.033894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.033909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.036195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.036594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.037382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.038776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.039050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.040760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.042327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.043635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.045025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.045297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.045313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.045327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.045341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.047812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.048205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.049899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.696 [2024-05-15 00:13:36.051693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.051967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.053647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.054391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.055773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.057418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.057687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.057704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.057718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.057733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.060249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.061701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.063087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.064739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.065008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.065954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.067652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.069450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.071189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.071465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.071483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.071497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.071511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.074759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.696 [2024-05-15 00:13:36.076155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.077816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.079473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.079805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.081342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.082742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.084404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.086051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.086434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.086451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.086467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.086484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.090191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.091855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.093512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.094817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.095126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.096503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.098157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.099832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.100667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.101155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.101175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.101190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.696 [2024-05-15 00:13:36.101205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.105069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.106732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.107137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.108810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.109082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.110771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.111300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.111700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.112090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.112582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.112605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.112622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.112639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.115230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.115632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.116025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.116425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.116832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.117236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.117631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.118020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.118432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.118831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.696 [2024-05-15 00:13:36.118849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.118864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.118879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.122110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.122531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.122924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.123314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.123777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.124181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.124590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.125840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.128443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.128836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.129227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.129624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.129986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.130407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.130817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.696 [2024-05-15 00:13:36.131206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.696 [2024-05-15 00:13:36.131602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.960 [2024-05-15 00:13:36.295144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (identical error repeated continuously between 00:13:36.131602 and 00:13:36.295144; intermediate occurrences omitted) 
00:31:35.960 [2024-05-15 00:13:36.296880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.298036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.299428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.301056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.301329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.301346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.301361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.301375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.303990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.305808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.307497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.309311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.309596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.310346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.311734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.313390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.315057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.315332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.315349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.315363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.315379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.319097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.320501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.322146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.960 [2024-05-15 00:13:36.323813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.324229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.325918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.327720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.329395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.330964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.331316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.331335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.331349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.331364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.335009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.336678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.338337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.339197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.339476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.340872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.342530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.344191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.344594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.345031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.960 [2024-05-15 00:13:36.345049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.345064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.345080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.348832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.961 [2024-05-15 00:13:36.350484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.351856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.353319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.353651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.355311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.356961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.357839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.358237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.358684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.358708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.358723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.358738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.362601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.364333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.365225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.366621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.366893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.368582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.370028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.370426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.370817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.371262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.371279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.371295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.961 [2024-05-15 00:13:36.371309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.374722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.375612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.377293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.379114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.379392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.381060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.381462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.381853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.382244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.382680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.382698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.382714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.382729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.385879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.387322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.388721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.390366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.390658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.391605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.391999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.392385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.392780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.393201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.961 [2024-05-15 00:13:36.393218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.393232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.393247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.395517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.396903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.398552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.400188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.400510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.400919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.401312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.401709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.402102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.402375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.402392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.402414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.402428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.405739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.407522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.409176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.410714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.411109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.411523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.411916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.412304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.961 [2024-05-15 00:13:36.413785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.414097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.414114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.414128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.414143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.417118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.418783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.420445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.420844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.421299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.421710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.422102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.422798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.424182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.424459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.424477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.424492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.424506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.427772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.429421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.430379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.430803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.431254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.431661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.961 [2024-05-15 00:13:36.432050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.433693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.435489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.961 [2024-05-15 00:13:36.435766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.435787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.435801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.435816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.439085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.440596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.440989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.441381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.441745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.442150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.443404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.444786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.446446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.446721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.446737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.446752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.446766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.450107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.450514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.450907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.451300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.962 [2024-05-15 00:13:36.451745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.452667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.454050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.455722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.457373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.457715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.457734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.457748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.457763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.460158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.460567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.460967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.461359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.461798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.463608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.465336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.467183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.468861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.469293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.469311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.469326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.469340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.471328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.471727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.962 [2024-05-15 00:13:36.472119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.472516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.472788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.474185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.475815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.477466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.478216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.478505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.478523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.478538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.478552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.480565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.480966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.481359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.482560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.482881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.484567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.486208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.487340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.489097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.489405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.489422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.489437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.489451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.962 [2024-05-15 00:13:36.491807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.492202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.493147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.494543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.494819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.496506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.497919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.499386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.500979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.501258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.501274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.501289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.501303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.503763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.504949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.506333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.507885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.508213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.509700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.511078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.512632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.513247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.513710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.513728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.962 [2024-05-15 00:13:36.513748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.513764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.517567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.962 [2024-05-15 00:13:36.519225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.520637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.521679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.521990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.523695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.525354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.526653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.527047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.527481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.527499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.527514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.527529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.530132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.530533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.530923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.531312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.531711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.532118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.532518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.532910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.533323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.963 [2024-05-15 00:13:36.533773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.533791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.533807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.533822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.536583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.536981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.537386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.537806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.538255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.538665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.539058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.539452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.539854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.540277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.540294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.540308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.540324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.543094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.543504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.543897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.544289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.544728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.545135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.545544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:35.963 [2024-05-15 00:13:36.545935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.546324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.546734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.546751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.546766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:35.963 [2024-05-15 00:13:36.546781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.549358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.549765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.550158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.550553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.550881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.551287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.551682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.552875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.555516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.555922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.556317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.556716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.246 [2024-05-15 00:13:36.557168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.247 [2024-05-15 00:13:36.557577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.557969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.558381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.558790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.559246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.559265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.559281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.559297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.561985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.562383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.562778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.563168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.563581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.563988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.564410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.564802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.565196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.565670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.565689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.565704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.565725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.568474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.568867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.247 [2024-05-15 00:13:36.569259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.250 [2024-05-15 00:13:36.690598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.692281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.693927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.695373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.696798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.697135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.697151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.697166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.697180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.699424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.699834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.700227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.701812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.702084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.703756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.705418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.706259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.707644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.707917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.707933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.707948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.707962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.710337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.710740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.250 [2024-05-15 00:13:36.712275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.713673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.713942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.715631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.716466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.718059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.719794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.720064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.720081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.720095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.250 [2024-05-15 00:13:36.720110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.722623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.723658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.725040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.726694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.726966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.728342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.729861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.731226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.732861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.733132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.733148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.733162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.733177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.251 [2024-05-15 00:13:36.735977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.737485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.739127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.740774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.741045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.741964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.743344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.745013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.746655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.746991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.747008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.747026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.747041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.751206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.752740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.754389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.756145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.756621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.758104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.759766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.761424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.762821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.763250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.763267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.251 [2024-05-15 00:13:36.763282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.763297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.766909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.768574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.770229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.770990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.771260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.772929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.774723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.776460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.776853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.777289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.777307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.777322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.777338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.781052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.782723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.783751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.785568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.785843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.787518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.789167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.789694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.790089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.251 [2024-05-15 00:13:36.790512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.790530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.790546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.790561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.794210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.795708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.797089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.798491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.798761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.800434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.801445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.801854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.802244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.802716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.802734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.802749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.802765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.806145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.806965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.808358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.810022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.810293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.811830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.812221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.251 [2024-05-15 00:13:36.812622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.813012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.813468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.813485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.813500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.813515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.251 [2024-05-15 00:13:36.816060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.817805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.819545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.821376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.821656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.822183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.822582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.822974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.823363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.823747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.823764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.823779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.823793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.826658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.828049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.829709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.831346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.831716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.513 [2024-05-15 00:13:36.832131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.832529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.832924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.833448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.833720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.833737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.833751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.833771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.836814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.838460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.840106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.841170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.841594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.841999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.842390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.842787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.844506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.844822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.844839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.844853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.844868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.848406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.850069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.851615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.513 [2024-05-15 00:13:36.852007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.852441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.852844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.853238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.854490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.855875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.856144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.856161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.856175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.856190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.859524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.861182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.861586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.861981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.862375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.862782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.863561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.864947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.866611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.866881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.866897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.866912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.866926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.870287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.513 [2024-05-15 00:13:36.871024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.871424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.871816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.872266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.872671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.874222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.875929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.877582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.877853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.877870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.877884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.877900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.880829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.881229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.881626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.882019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.882464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.883913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.885311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.886973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.888642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.889043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.889059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.513 [2024-05-15 00:13:36.889073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.514 [2024-05-15 00:13:36.889088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.891012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.891414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.891803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.892195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.892523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.893912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.895567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.897220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.898220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.898497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.898514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.898528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.898543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.900569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.900966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.901358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.901755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.902027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.903716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.905517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.906157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.907808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.908078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.514 [2024-05-15 00:13:36.908094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.908108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.908123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.910695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.912148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.913538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.915232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.915515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.916339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.918010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.919503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.921189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.921463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.921480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.921495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.921509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.924269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.924677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.925074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.925472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.925844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.926248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.926647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.927045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.514 [2024-05-15 00:13:36.927446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.927898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.927918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.927934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.927949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.930536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.930933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.931325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.931728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.932074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.932486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.932888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.933279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.933681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.934119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.934137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.934153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.934168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.936839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.937237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.937640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.938037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.938500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.938902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.514 [2024-05-15 00:13:36.939292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.939688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.940088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.940496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.940513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.940528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.940543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.943803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.944209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.944612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.945003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.945485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.945886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.946282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.946682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.947077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.947527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.947544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.947559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.947574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.950303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.950706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.514 [2024-05-15 00:13:36.951098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.951496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.515 [2024-05-15 00:13:36.952019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.952431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.952827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.953219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.953614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.954022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.954039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.954053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.954068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.956667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.957065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.957468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.957862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.958300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.958711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.959102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.959498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.959893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.960258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.960276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.960291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.960305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.963046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.515 [2024-05-15 00:13:36.963446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.515 [2024-05-15 00:13:36.963840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:36.515 [... the same *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated several hundred times between 00:13:36.963840 and 00:13:37.181272; intermediate timestamps omitted ...]
00:31:36.781 [2024-05-15 00:13:37.181272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:36.781 [2024-05-15 00:13:37.181286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.181300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.184235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.185639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.187296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.188936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.189204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.190329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.191717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.193357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.195004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.195358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.195375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.195391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.195414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.199751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.201498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.203156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.204667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.204974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.206364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.208019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.209679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.210681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.781 [2024-05-15 00:13:37.211141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.211161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.211177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.211194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.214960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.216687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.218509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.219576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.219890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.221554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.223196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.224542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.224932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.225372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.225389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.225414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.225430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.229149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.230796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.231545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.232937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.233209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.234965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.236572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.781 [2024-05-15 00:13:37.236963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.237350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.237766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.237783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.237797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.237812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.241342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.242123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.243679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.245373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.245644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.247299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.247700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.248955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.251743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.253484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.255074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.256781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.257050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.781 [2024-05-15 00:13:37.257708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.258097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.258489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.781 [2024-05-15 00:13:37.258882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.259244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.259261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.259275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.259289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.262252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.263628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.265263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.266914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.267294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.267715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.268104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.268497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.269233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.269507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.269524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.269539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.269554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.272591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.274249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.275907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.782 [2024-05-15 00:13:37.276736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.277208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.277612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.278013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.278428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.279953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.280222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.280238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.280252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.280267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.283603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.285267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.286386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.286779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.287215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.287621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.288023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.289765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.291571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.291843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.291859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.291874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.291888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.295250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.782 [2024-05-15 00:13:37.296553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.296943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.297331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.297790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.298192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.299874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.301435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.303099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.303367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.303383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.303402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.303417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.306738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.307136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.307530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.307918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.308353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.309746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.311169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.312832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.314495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.314917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.314934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.314948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.782 [2024-05-15 00:13:37.314963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.316906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.317305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.317699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.318088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.318360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.319741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.321260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.322207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.323931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.324199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.324215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.324229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.324244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.326463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.328134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.328531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.782 [2024-05-15 00:13:37.329338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.329642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.331478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.333157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.334726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.336047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.336359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.783 [2024-05-15 00:13:37.336375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.336390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.336408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.338537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.338930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.339319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.340949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.341220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.342919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.344578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.344993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.346523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.346794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.346810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.346825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.346839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.349179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.349577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.349970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.350366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.350739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.351140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.351534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.351923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.783 [2024-05-15 00:13:37.352313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.352664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.352681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.352696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.352713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.355601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.355999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.356409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.356798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.357273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.357680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.358077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.358476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.358871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.359312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.359329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.359343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.359358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.362139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.362542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.362933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.363325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.363759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.364161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.783 [2024-05-15 00:13:37.364558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.364946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.365339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.365739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.365757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.365772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.783 [2024-05-15 00:13:37.365787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.368414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.368809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.369205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.369602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.370044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.370450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.370841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.371231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.371629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.372090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.372108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.372125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.372140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.045 [2024-05-15 00:13:37.374825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.375235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.375634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.376023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.046 [2024-05-15 00:13:37.376471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.376877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.377275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.377675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.378063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.378501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.378519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.378534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.378550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.381207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.381604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.381997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.382391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.382823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.383224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.383617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.384816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.387532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.387932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.046 [2024-05-15 00:13:37.388325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.388718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.389144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.389549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.389942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.390344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.390757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.391201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.391219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.391234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.391249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.394066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.394462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.394852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.395243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.395632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.396035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.396433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.396821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.397208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.397631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.397649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.397664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.397678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.046 [2024-05-15 00:13:37.400325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.400729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.401124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.401520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.401953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.402366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.402764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.403989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.406648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.407043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.407436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.407843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.408281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.408688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.409084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.409491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.409879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.410368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.046 [2024-05-15 00:13:37.410385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.046 [2024-05-15 00:13:37.410405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same *ERROR* line from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: Failed to get src_mbufs!) repeats continuously from 00:13:37.410405 through 00:13:37.642754; intermediate duplicate entries omitted ...]
00:31:37.316 [2024-05-15 00:13:37.642754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:37.316 [2024-05-15 00:13:37.644430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.645152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.646546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.648647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.650333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.650730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.651124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.651547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.651566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.651581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.651595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.655060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.656092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.657926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.659617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.661568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.662115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.662511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.662900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.663331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.663350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.663368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.663384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.666674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.316 [2024-05-15 00:13:37.667899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.669287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.670952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.672422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.672815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.673203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.673601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.674024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.674042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.674058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.674073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.676307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.677731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.679372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.681017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.681687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.682078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.682474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.682863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.683164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.683180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.683195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.683209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.686128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.316 [2024-05-15 00:13:37.687530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.689175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.690826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.691751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.692142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.692536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.692939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.693213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.693229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.693244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.693259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.696554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.698007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.698406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.698795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.699612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.701110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.702513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.704166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.704450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.704467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.704482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.704496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.707794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.316 [2024-05-15 00:13:37.708199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.708595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.708985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.710349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.711735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.316 [2024-05-15 00:13:37.713422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.715083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.715409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.715427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.715444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.715459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.718156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.718560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.718955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.719345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.720186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.720589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.720978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.721368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.721831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.721851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.721866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.721881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.724557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.317 [2024-05-15 00:13:37.724955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.725348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.725751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.726540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.726932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.727321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.727724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.728108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.728127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.728145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.728161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.730936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.731335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.731735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.732124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.732982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.733393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.733797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.734187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.734637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.734655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.734671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.734686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.737284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.317 [2024-05-15 00:13:37.737687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.738077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.738477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.739301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.739703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.740919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.743524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.743921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.744315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.744715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.745542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.745934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.746325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.746731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.747183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.747202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.747219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.747235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.749924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.317 [2024-05-15 00:13:37.750339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.750736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.751130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.751948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.752346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.752750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.753143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.753599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.753620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.753636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.753652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.756318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.756721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.757115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.757520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.758301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.758699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.759870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.762673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.317 [2024-05-15 00:13:37.763072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.763473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.763862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.764693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.317 [2024-05-15 00:13:37.765087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.765491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.765898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.766330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.766353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.766368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.766382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.769157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.769557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.769963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.770357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.771185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.771586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.771974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.772363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.772783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.772800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.772815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.772829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.775412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.318 [2024-05-15 00:13:37.775814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.776210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.776613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.777443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.777834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.778226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.778630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.779057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.779073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.779088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.779103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.781748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.782143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.782540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.782941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.783778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.784177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.784590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.784981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.785447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.785464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.785480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.785496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.788085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.318 [2024-05-15 00:13:37.788488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.788881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.789279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.790045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.790444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.790832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.791223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.791613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.791631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.791646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.791661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.794427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.795876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.796726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.797810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.798650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.799042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.799440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.799836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.800218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.800237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.800257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.800272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.803293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.318 [2024-05-15 00:13:37.803707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.804101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.804490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.805328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.805734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.806454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.807706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.808151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.808169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.808184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.808199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.811578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.813238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.814863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.816127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.817784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.819444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.821107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.821887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.822371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.822389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.822409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.822426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.824578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.318 [2024-05-15 00:13:37.826324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.828153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.318 [2024-05-15 00:13:37.829824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.831574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.833240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.833289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.834943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.835253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.835270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.835285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.835300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.838007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.838409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.840122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.840176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.842103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.842153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.843812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.844629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.844934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.844951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.844966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.844980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.847046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.319 [2024-05-15 00:13:37.847096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.847490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.847534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.847999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.849469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.850855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.852516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.852790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.852806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.852821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.852835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.856284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.856342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.857447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.857483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.857848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.857893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.858653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.319 [2024-05-15 00:13:37.860686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.319 [2024-05-15 00:13:37.860731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "*ERROR*: Failed to get src_mbufs!" messages from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for every log entry between 00:13:37.860 and 00:13:38.081 ...]
00:31:37.587 [2024-05-15 00:13:38.081797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:37.587 [2024-05-15 00:13:38.083458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.085114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.085491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.087242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.088824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.090519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.092335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.092726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.092742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.092757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.092771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.095049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.095999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.097378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.099037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.099307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.100765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.102199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.103580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.105235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.105513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.105530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.105544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.105558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.587 [2024-05-15 00:13:38.110325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.112168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.113865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.114641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.114928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.116777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.118444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.120002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.120791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.121069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.121086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.121101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.121116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.587 [2024-05-15 00:13:38.123559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.123958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.124351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.124740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.125185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.125590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.125982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.126374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.128086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.128552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.128570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.588 [2024-05-15 00:13:38.128586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.128601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.131758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.132158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.132553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.132946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.133315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.133728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.135325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.135718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.136465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.136735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.136752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.136767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.136781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.139462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.139859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.140248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.140645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.141019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.141946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.142984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.143374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.144700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.588 [2024-05-15 00:13:38.145065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.145082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.145096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.145111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.148406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.148807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.149203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.151005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.151483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.151889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.153545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.153940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.154331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.154686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.154704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.154719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.154734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.157414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.157812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.158323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.159777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.160244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.160793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.162225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.588 [2024-05-15 00:13:38.162615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.163010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.163423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.163441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.163455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.163470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.167307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.168332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.588 [2024-05-15 00:13:38.168727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.170073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.170449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.170855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.171243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.171640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.172032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.172479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.172500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.172516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.172531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.176213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.176715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.177107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.178920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.179395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.851 [2024-05-15 00:13:38.179801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.180195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.180594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.180989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.181424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.181441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.181457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.181472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.184815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.186384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.186775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.187171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.187594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.188014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.188412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.188799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.189186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.189577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.189594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.189609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.189624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.192503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.193655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.194047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.851 [2024-05-15 00:13:38.194443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.194824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.195229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.195622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.196815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.199693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.200098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.200496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.200888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.201329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.201731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.202119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.202517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.202912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.203181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.203197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.203211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.203226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.205847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.851 [2024-05-15 00:13:38.206269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.206671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.207062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.207500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.207917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.208309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.208707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.209214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.209488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.209506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.209520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.209535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.212659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.213055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.213449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.851 [2024-05-15 00:13:38.213841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.214249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.214663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.215403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.216636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.217025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.217381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.217402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.217416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.852 [2024-05-15 00:13:38.217431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.219906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.220298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.220693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.221078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.221427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.221830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.222923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.223796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.224187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.224489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.224507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.224521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.224535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.228524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.228922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.229312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.229709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.230067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.230479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.232170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.232571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.232961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.233229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.852 [2024-05-15 00:13:38.233250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.233265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.233279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.237333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.237745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.238205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.239715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.240199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.240606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.241002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.241680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.242967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.243427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.243445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.243461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.243476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.248776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.250191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.250239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.251884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.252198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.253878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.255520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.256285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.852 [2024-05-15 00:13:38.257992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.258439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.258457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.258472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.258488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.262504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.264268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.266021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.267636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.267992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.269378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.271035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.271082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.272733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.273074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.273092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.273107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.273121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.280373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.280439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.282167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.282225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.282498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.284165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.852 [2024-05-15 00:13:38.284218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.285242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.286637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.286905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.286921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.286936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.286950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.288980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.289029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.290504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.852 [2024-05-15 00:13:38.290982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.291424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.291481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.293158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.294680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.296327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.296599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.296617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.296631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.296645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.300743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.301777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.302167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.302211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.853 [2024-05-15 00:13:38.302505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.302561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.302602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.303737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.305262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.305306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.305346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.305389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.305808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.307270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.307316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.308965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.309010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.309278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.309295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.309314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.309329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.312723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.853 [2024-05-15 00:13:38.312776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.853 [2024-05-15 00:13:38.312820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... (same *ERROR* line from accel_dpdk_cryptodev.c:468 repeated continuously for every allocation attempt between 00:13:38.312820 and 00:13:38.577589) ...
00:31:38.119 [2024-05-15 00:13:38.577589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:38.119 [2024-05-15 00:13:38.577605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.584150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.585910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.587708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.588631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.588908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.589317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.590089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.591285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.591681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.592027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.592047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.592062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.592076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.597776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.599445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.600093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.601706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.602185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.602594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.604175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.604569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.605437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.605759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.119 [2024-05-15 00:13:38.605776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.605791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.605805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.611623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.613008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.119 [2024-05-15 00:13:38.614006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.614965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.615394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.616270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.617370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.617761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.618154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.618508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.618525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.618540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.618555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.625167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.625579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.627276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.627669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.628096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.629914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.630304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.630700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.120 [2024-05-15 00:13:38.631100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.631490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.631508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.631522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.631537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.637134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.637784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.639106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.639501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.639867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.641324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.641720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.642917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.647295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.648465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.649266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.649661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.649946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.650894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.120 [2024-05-15 00:13:38.651281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.651680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.652075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.652352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.652369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.652383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.652404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.656005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.657546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.657972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.658362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.658637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.659197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.659593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.659988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.660387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.660665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.660682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.660696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.660711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.663897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.665692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.666082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.666658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.120 [2024-05-15 00:13:38.666931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.667337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.667734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.668134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.668789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.669062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.669079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.669099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.669114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.672342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.673760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.674150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.675076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.675354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.675763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.676155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.120 [2024-05-15 00:13:38.676555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.677559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.677851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.677868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.677883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.677898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.681549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.682504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.121 [2024-05-15 00:13:38.682902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.684271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.684655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.685061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.685458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.685851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.687264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.687667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.687684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.687699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.687714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.691615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.692308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.692706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.694351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.694791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.695195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.695591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.695987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.697645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.698101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.698118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.698134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.698151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.121 [2024-05-15 00:13:38.702196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.702766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.703157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.704828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.705283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.121 [2024-05-15 00:13:38.705695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.706089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.706499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.708154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.708614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.708632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.708648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.708663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.712841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.713299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.713694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.715472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.715956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.716360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.716771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.717171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.382 [2024-05-15 00:13:38.719019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.719548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.719567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.383 [2024-05-15 00:13:38.719582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.719597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.724084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.724500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.724893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.726532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.726986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.727384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.727786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.728181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.729813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.730262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.730283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.730299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.730314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.734890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.735299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.735715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.737258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.737726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.738130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.738533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.738978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.740489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.383 [2024-05-15 00:13:38.740965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.740986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.741002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.741021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.745811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.746210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.746868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.748170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.748621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.749023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.749425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.750091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.751395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.751841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.751859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.751874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.751893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.756829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.757718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.758782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.759172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.759509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.760812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.761201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.383 [2024-05-15 00:13:38.761595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.761990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.762310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.762326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.762341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.762355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.766946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.767997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.768913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.769310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.769622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.770665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.771056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.771452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.771845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.772127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.772143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.772159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.772173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.775942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.777506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.777905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.778294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.778577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.383 [2024-05-15 00:13:38.779094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.779489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.781195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.783016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.783292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.783308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.783323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.783337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.789453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.790318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.790366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.790759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.791039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.791830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.383 [2024-05-15 00:13:38.792221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.793697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.795359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.795654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.795672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.795687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.795702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.801579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.801979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.802656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.384 [2024-05-15 00:13:38.803938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.804382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.805084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.806477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.806525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.808166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.808447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.808465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.808479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.808494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.814677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.814739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.815129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.815173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.815589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.817225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.817271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.817667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.818697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.819006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.819023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.819038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.819053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.825484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.384 [2024-05-15 00:13:38.825540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.826202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.827813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.828264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.828319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.828760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.830282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.830675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.831082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.831099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.831114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.831128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.835645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.837308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.838972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.839019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.839389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.839455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.839504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.841151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.841194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.841648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.841670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.384 [2024-05-15 00:13:38.841685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.384 [2024-05-15 00:13:38.841701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats continuously from 00:13:38.841 to 00:13:39.112; duplicate log entries condensed ...]
00:31:38.651 [2024-05-15 00:13:39.112458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:38.651 [2024-05-15 00:13:39.112836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.112854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.112869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.112884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.116303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.116707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.117097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.117498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.117937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.118342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.118742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.119952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.123409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.123809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.124199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.124595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.125036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.125448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.125845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.651 [2024-05-15 00:13:39.126251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.651 [2024-05-15 00:13:39.126647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.127140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.127158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.127175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.127191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.130600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.131014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.131417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.131807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.132206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.132618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.133011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.133413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.133804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.134246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.134265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.134280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.134295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.137749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.138152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.138562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.138961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.139438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.652 [2024-05-15 00:13:39.139839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.140227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.140626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.141032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.141436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.141454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.141469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.141484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.144968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.145365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.145765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.146158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.146549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.146953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.147341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.147738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.148126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.148499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.148517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.148532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.148546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.152147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.152556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.152943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.652 [2024-05-15 00:13:39.153330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.153749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.154162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.154581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.154974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.155362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.155823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.155842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.155857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.155874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.159360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.159772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.160163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.160560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.161043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.161451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.161854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.162249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.162650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.163102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.163120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.163134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.163149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.165783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.652 [2024-05-15 00:13:39.166179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.166576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.166970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.167409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.167814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.168206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.168600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.168996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.169405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.169422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.169437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.169452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.172151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.172555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.172952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.173356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.173809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.174209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.652 [2024-05-15 00:13:39.174602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.176156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.176561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.176835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.176852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.176866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.653 [2024-05-15 00:13:39.176880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.179662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.180061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.180464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.180853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.181265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.181670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.182062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.182463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.182856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.183301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.183319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.183334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.183349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.186000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.186407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.186799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.187191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.187591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.187997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.188390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.188781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.189169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.189563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.653 [2024-05-15 00:13:39.189581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.189595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.189610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.192243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.193636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.195277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.196921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.197345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.197756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.198146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.198547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.199131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.199411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.199428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.199443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.199459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.202455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.204108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.205754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.206902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.207327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.207736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.208125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.208519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.653 [2024-05-15 00:13:39.210367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.210649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.210666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.210681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.210695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.214037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.215822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.217469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.217853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.218299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.218705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.219094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.220422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.221787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.222058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.222074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.222088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.222103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.225380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.227035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.227449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.227840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.228217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.228622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.653 [2024-05-15 00:13:39.229457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.230810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.232469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.232742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.232760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.232774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.232789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.653 [2024-05-15 00:13:39.236100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.236937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.237330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.237723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.238188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.238712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.240135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.241796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.243453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.243727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.243743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.243759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.243774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.246533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.246927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.247316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.247713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.918 [2024-05-15 00:13:39.248161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.249969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.251631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.253392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.255182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.255563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.255581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.255595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.255609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.257540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.257934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.258324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.258720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.258991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.260390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.262017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.263660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.264390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.264678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.264704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.264720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.264734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.918 [2024-05-15 00:13:39.266802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.267196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.919 [2024-05-15 00:13:39.267588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.268949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.269268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.270951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.272607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.273595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.275385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.275661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.275678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.275692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.275707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.278030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.278430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.279436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.280814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.281087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.282764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.284135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.285620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.287006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.287277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.287294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.287308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.287323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.919 [2024-05-15 00:13:39.289954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.290584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.290635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.292014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.292286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.293963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.295492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.296833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.298212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.298490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.298507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.298521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.298536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.301155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.301678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.303105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.304745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.305020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.306812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.307852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.307900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.309289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.309565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.309583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.919 [2024-05-15 00:13:39.309597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.309612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.312272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.312323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.313021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.313067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.313412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.315111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.315161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.316804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.318079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.318367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.318384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.318405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.318419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.320426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.320477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.320864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.321249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.321678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.321736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.323200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.324844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.919 [2024-05-15 00:13:39.326474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.919 [2024-05-15 00:13:39.326745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same *ERROR* line from accel_dpdk_cryptodev.c:468 repeats for every log entry between 00:13:39.326745 and 00:13:39.478319]
00:31:38.925 [2024-05-15 00:13:39.478319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:38.925 [2024-05-15 00:13:39.478721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.479117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.479511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.479959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.479977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.479992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.480007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.482705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.483102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.483496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.483888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.484318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.484729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.925 [2024-05-15 00:13:39.485119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.485509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.485902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.486311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.486328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.486342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.486357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.489021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.489425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.489826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.490216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.926 [2024-05-15 00:13:39.490658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.491057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.491451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.491844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.492238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.492654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.492675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.492690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.492705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.495330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.495731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.496121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.496511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.496946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.497347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.497748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.498140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.498530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.498974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.498992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.499008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.499023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.926 [2024-05-15 00:13:39.501854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.502252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.188 [2024-05-15 00:13:39.502653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.503049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.503467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.503866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.504256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.504650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.505038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.505381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.505403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.505419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.505433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.508257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.508676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.509074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.509467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.509922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.510323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.510724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.511966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.188 [2024-05-15 00:13:39.514561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.514960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.515349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.515745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.516136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.516554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.516945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.517332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.517724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.518088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.518105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.518120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.518135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.520754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.521591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.522697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.523860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.524192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.524606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.188 [2024-05-15 00:13:39.524999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.525408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.525800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.526167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.526184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.189 [2024-05-15 00:13:39.526199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.526213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.528855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.529254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.529658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.530050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.530457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.530855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.531247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.531653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.532048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.532494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.532512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.532527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.532546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.535119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.535524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.535914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.536303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.536672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.537078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.537477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.537870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.538266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.189 [2024-05-15 00:13:39.538717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.538738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.538758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.538775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.541535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.543160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.544615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.546261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.546533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.547351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.547746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.548975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.551305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.552687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.554328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.555978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.556270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.556685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.557077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.189 [2024-05-15 00:13:39.557471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.557862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.558142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.558159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.558173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.558187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.561412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.562894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.564552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.566254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.566666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.567068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.567461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.567847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.568819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.569128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.569145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.569159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.569173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.572131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.573788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.575437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.576329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.576763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.189 [2024-05-15 00:13:39.577169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.577567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.577958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.579756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.580055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.580072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.580086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.580100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.583356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.585139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.586921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.587308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.189 [2024-05-15 00:13:39.587737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.588136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.588528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.589642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.591028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.591297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.591314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.591328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.591342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.594553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.596208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.596972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.190 [2024-05-15 00:13:39.597370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.597808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.598212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.598607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.600364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.602105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.602374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.602390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.602408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.602422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.605874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.607557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.607948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.608338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.608738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.609137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.610380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.611773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.613416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.613693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.613710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.613728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.613743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.616977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.190 [2024-05-15 00:13:39.617616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.618008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.618403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.618886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.619308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.620843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.622525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.624173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.624453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.624470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.624485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.624499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.627442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.627841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.628231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.628634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.629071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.630665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.632079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.633726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.635369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.635749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.635766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.635780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.190 [2024-05-15 00:13:39.635795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.637672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.638067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.638463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.638859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.639211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.640592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.642229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.643882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.645089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.645364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.645380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.645395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.645414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.647374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.647771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.648172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.648664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.648935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.650476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.652117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.653848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.654791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.655113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.190 [2024-05-15 00:13:39.655130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.655144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.655159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.657188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.657591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.657984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.659317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.659631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.661304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.662955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.190 [2024-05-15 00:13:39.664021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.665841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.666137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.666154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.666168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.666183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.668379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.668776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.668824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.669687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.669999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.671757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.673404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.674927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.191 [2024-05-15 00:13:39.676238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.676552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.676569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.676583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.676598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.678681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.679077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.679473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.681140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.681491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.683167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.684819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.684866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.685601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.685874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.685892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.685907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.685925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.687958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.688007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.688396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.688444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.688880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.191 [2024-05-15 00:13:39.690023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.191 [2024-05-15 00:13:39.690069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats several hundred more times between 00:13:39.690069 and 00:13:39.805365; the duplicate entries are elided ...]
00:31:39.455 [2024-05-15 00:13:39.805365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
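The burst above is tightly clustered: the first and last occurrences are stamped 00:13:39.690069 and 00:13:39.805365, so the whole flood fits in roughly 805.4 ms - 690.1 ms ≈ 115 ms. The message means accel_dpdk_cryptodev_task_alloc_resources() could not pull source mbufs from the module's mempool while the 128-deep, 64 KiB verify workload was in flight. The run is not aborted and the verify results below are still produced, so the condition is transient pool exhaustion; how the module recovers (presumably by retrying the affected tasks once mbufs are returned to the pool) is an inference from the outcome, not something these log lines state.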
00:31:40.022 
00:31:40.022 Latency(us)
00:31:40.022 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:40.022 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x0 length 0x100
00:31:40.022 crypto_ram : 6.18 41.40 2.59 0.00 0.00 3001424.14 279012.40 2742710.09
00:31:40.022 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x100 length 0x100
00:31:40.022 crypto_ram : 6.12 41.85 2.62 0.00 0.00 2963228.05 286306.84 2582232.38
00:31:40.022 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x0 length 0x100
00:31:40.022 crypto_ram1 : 6.19 41.39 2.59 0.00 0.00 2895027.87 277188.79 2538465.73
00:31:40.022 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x100 length 0x100
00:31:40.022 crypto_ram1 : 6.12 41.84 2.62 0.00 0.00 2858085.51 288130.45 2363399.12
00:31:40.022 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x0 length 0x100
00:31:40.022 crypto_ram2 : 5.62 250.42 15.65 0.00 0.00 452043.50 20287.67 700266.41
00:31:40.022 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x100 length 0x100
00:31:40.022 crypto_ram2 : 5.69 269.83 16.86 0.00 0.00 424107.93 82974.27 682030.30
00:31:40.022 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x0 length 0x100
00:31:40.022 crypto_ram3 : 5.80 264.70 16.54 0.00 0.00 417064.31 72488.51 384781.80
00:31:40.022 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:40.022 Verification LBA range: start 0x100 length 0x100
00:31:40.022 crypto_ram3 : 5.76 276.26 17.27 0.00 0.00 398253.34 31913.18 492374.82
00:31:40.022 ===================================================================================================================
00:31:40.022 Total : 1227.69 76.73 0.00 0.00 783964.47 20287.67 2742710.09
00:31:40.590 
00:31:40.590 real 0m9.618s
00:31:40.590 user 0m18.067s
00:31:40.590 sys 0m0.659s
00:31:40.590 00:13:40 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:31:40.590 00:13:40 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:31:40.590 ************************************
00:31:40.590 END TEST bdev_verify_big_io
00:31:40.590 ************************************
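A quick arithmetic check on the verify table above (the relationships are not printed by bdevperf, but they follow directly from the numbers shown): MiB/s is IOPS multiplied by the 65536-byte IO size and divided by 2^20, e.g. 41.40 × 65536 / 1048576 ≈ 2.59 MiB/s for the first crypto_ram job and 276.26 × 65536 / 1048576 ≈ 17.27 MiB/s for the last crypto_ram3 job, matching the printed columns. The Total row sums the eight jobs: 41.40 + 41.85 + 41.39 + 41.84 + 250.42 + 269.83 + 264.70 + 276.26 = 1227.69 IOPS, and the per-job MiB/s figures sum to 76.74, i.e. the printed 76.73 up to rounding; the Total min (20287.67 us) and max (2742710.09 us) are simply the smallest and largest latencies reported by any single job.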
--json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:40.590 [2024-05-15 00:13:41.124148] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:40.590 [2024-05-15 00:13:41.124207] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid572801 ] 00:31:40.849 [2024-05-15 00:13:41.249606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:40.849 [2024-05-15 00:13:41.346098] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:40.849 [2024-05-15 00:13:41.367367] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:40.849 [2024-05-15 00:13:41.375397] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:40.849 [2024-05-15 00:13:41.383418] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:41.109 [2024-05-15 00:13:41.490013] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:43.642 [2024-05-15 00:13:43.912404] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:43.642 [2024-05-15 00:13:43.912469] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:43.642 [2024-05-15 00:13:43.912484] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.642 [2024-05-15 00:13:43.920423] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:43.642 [2024-05-15 00:13:43.920442] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:43.642 [2024-05-15 00:13:43.920454] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.642 [2024-05-15 00:13:43.928443] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:43.642 [2024-05-15 00:13:43.928461] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:43.642 [2024-05-15 00:13:43.928472] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.642 [2024-05-15 00:13:43.936463] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:43.642 [2024-05-15 00:13:43.936481] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:43.642 [2024-05-15 00:13:43.936492] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.642 Running I/O for 1 seconds... 
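The four crypto_ram* targets exercised by this write_zeroes pass were just assembled above: each Malloc0-Malloc3 base bdev is wrapped by a crypto vbdev keyed with one of the DEKs named in the log (test_dek_qat_cbc, test_dek_qat_xts, test_dek_qat_cbc2, test_dek_qat_xts2). The bdev.json handed to bdevperf is not reproduced in the trace; the snippet below is only a hypothetical sketch of one such malloc-plus-crypto pair in SPDK JSON-config form, with the method names, parameter names and sizes assumed rather than taken from the actual test file.

  # Hypothetical reconstruction, not the real test/bdev/bdev.json.
  cat > bdev.json.sketch << 'EOF'
  {
    "subsystems": [
      {
        "subsystem": "bdev",
        "config": [
          {
            "method": "bdev_malloc_create",
            "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
          },
          {
            "method": "bdev_crypto_create",
            "params": {
              "base_bdev_name": "Malloc0",
              "name": "crypto_ram",
              "key_name": "test_dek_qat_cbc"
            }
          }
        ]
      }
    ]
  }
  EOF

bdevperf then drives the write_zeroes workload against crypto_ram exactly as the run_test invocation above shows.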
00:31:44.576 00:31:44.576 Latency(us)
00:31:44.576 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:44.576 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:44.576 crypto_ram : 1.02 2014.40 7.87 0.00 0.00 63108.18 5613.30 76135.74
00:31:44.576 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:44.577 crypto_ram1 : 1.03 2027.50 7.92 0.00 0.00 62402.16 5584.81 70664.90
00:31:44.577 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:44.577 crypto_ram2 : 1.02 15510.42 60.59 0.00 0.00 8130.11 2450.48 10713.71
00:31:44.577 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:44.577 crypto_ram3 : 1.02 15542.48 60.71 0.00 0.00 8087.04 2450.48 8548.17
00:31:44.577 ===================================================================================================================
00:31:44.577 Total : 35094.81 137.09 0.00 0.00 14429.71 2450.48 76135.74
00:31:45.144 00:31:45.144 real 0m4.403s
00:31:45.144 user 0m3.829s
00:31:45.144 sys 0m0.533s
00:31:45.144 00:13:45 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable
00:31:45.144 00:13:45 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:31:45.144 ************************************
00:31:45.144 END TEST bdev_write_zeroes
00:31:45.144 ************************************
00:31:45.144 00:13:45 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:13:45 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:13:45 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:13:45 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:31:45.144 ************************************
00:31:45.144 START TEST bdev_json_nonenclosed
00:31:45.144 ************************************
00:31:45.144 00:13:45 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:45.144 [2024-05-15 00:13:45.622458] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization...
00:31:45.144 [2024-05-15 00:13:45.622517] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid573348 ]
00:31:45.403 [2024-05-15 00:13:45.749622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:45.403 [2024-05-15 00:13:45.853204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:31:45.403 [2024-05-15 00:13:45.853267] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:31:45.403 [2024-05-15 00:13:45.853288] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:45.403 [2024-05-15 00:13:45.853300] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:45.403 00:31:45.403 real 0m0.417s 00:31:45.403 user 0m0.268s 00:31:45.403 sys 0m0.147s 00:31:45.403 00:13:45 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:45.403 00:13:45 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:45.403 ************************************ 00:31:45.403 END TEST bdev_json_nonenclosed 00:31:45.403 ************************************ 00:31:45.662 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:45.662 00:13:46 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:31:45.662 00:13:46 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:45.662 00:13:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:45.662 ************************************ 00:31:45.662 START TEST bdev_json_nonarray 00:31:45.662 ************************************ 00:31:45.662 00:13:46 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:45.662 [2024-05-15 00:13:46.124318] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:45.662 [2024-05-15 00:13:46.124383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid573470 ] 00:31:45.921 [2024-05-15 00:13:46.253161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:45.921 [2024-05-15 00:13:46.353958] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:45.921 [2024-05-15 00:13:46.354039] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
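Both JSON negative tests behave the same way: bdevperf is pointed at a deliberately malformed config, json_config_prepare_ctx rejects it with the error quoted in the trace, and spdk_app_stop exits non-zero, which is what the test treats as success. The contents of nonenclosed.json and nonarray.json are not shown in the log; the two heredocs below are only an illustration of the kinds of breakage the error messages describe, not the actual test files.

  # Top-level content not wrapped in { } -- the shape rejected with
  # "Invalid JSON configuration: not enclosed in {}."
  cat << 'EOF'
  "subsystems": []
  EOF

  # "subsystems" present but not an array -- the shape rejected with
  # "Invalid JSON configuration: 'subsystems' should be an array."
  cat << 'EOF'
  { "subsystems": { "subsystem": "bdev" } }
  EOF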
00:31:45.921 [2024-05-15 00:13:46.354061] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:45.921 [2024-05-15 00:13:46.354073] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:45.921 00:31:45.921 real 0m0.410s 00:31:45.921 user 0m0.249s 00:31:45.921 sys 0m0.158s 00:31:45.921 00:13:46 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:45.921 00:13:46 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:45.921 ************************************ 00:31:45.921 END TEST bdev_json_nonarray 00:31:45.921 ************************************ 00:31:46.179 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:31:46.179 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:31:46.180 00:13:46 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:31:46.180 00:31:46.180 real 1m14.427s 00:31:46.180 user 2m41.855s 00:31:46.180 sys 0m10.590s 00:31:46.180 00:13:46 blockdev_crypto_qat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:46.180 00:13:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:46.180 ************************************ 00:31:46.180 END TEST blockdev_crypto_qat 00:31:46.180 ************************************ 00:31:46.180 00:13:46 -- spdk/autotest.sh@356 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:46.180 00:13:46 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:31:46.180 00:13:46 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:46.180 00:13:46 -- common/autotest_common.sh@10 -- # set +x 00:31:46.180 ************************************ 00:31:46.180 START TEST chaining 00:31:46.180 ************************************ 00:31:46.180 00:13:46 chaining -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:46.180 * Looking for test storage... 
00:31:46.180 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@7 -- # uname -s 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:46.180 00:13:46 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:46.180 00:13:46 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:46.180 00:13:46 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:46.180 00:13:46 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:46.180 00:13:46 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:46.180 00:13:46 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:46.180 00:13:46 chaining -- paths/export.sh@5 -- # 
export PATH 00:31:46.180 00:13:46 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@47 -- # : 0 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:31:46.180 00:13:46 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:46.180 00:13:46 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:46.180 00:13:46 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:46.180 00:13:46 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:46.180 00:13:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:54.334 
00:13:54 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@336 -- # return 1 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:31:54.334 WARNING: No supported devices were found, fallback requested for tcp test 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:31:54.334 00:13:54 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:31:54.334 Cannot find device "nvmf_tgt_br" 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@155 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:31:54.334 Cannot find device "nvmf_tgt_br2" 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@156 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:31:54.334 Cannot find device "nvmf_tgt_br" 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@158 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:31:54.334 Cannot find device "nvmf_tgt_br2" 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@159 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:31:54.334 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@162 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:31:54.334 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@163 -- # true 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:31:54.334 
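With no supported NIC present, nvmf_veth_init falls back to a purely local TCP fabric: the target runs inside the nvmf_tgt_ns_spdk namespace and reaches the initiator over veth pairs. Condensed from the ip commands in the trace (same interface names and addresses), the topology boils down to the sketch below; the trace then goes on to enslave the *_br peers to the nvmf_br bridge and open TCP port 4420 with iptables.

  # Target side lives in its own network namespace; the initiator stays in the root one.
  ip netns add nvmf_tgt_ns_spdk
  ip link add nvmf_init_if type veth peer name nvmf_init_br
  ip link add nvmf_tgt_if  type veth peer name nvmf_tgt_br
  ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2
  ip link set nvmf_tgt_if  netns nvmf_tgt_ns_spdk
  ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk

  # Initiator gets 10.0.0.1; the namespaced target interfaces get 10.0.0.2 and 10.0.0.3.
  ip addr add 10.0.0.1/24 dev nvmf_init_if
  ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
  ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2

  # Bring the endpoints up (the bridge-side peers are raised once nvmf_br exists).
  ip link set nvmf_init_if up
  ip link set nvmf_init_br up
  ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
  ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up
  ip netns exec nvmf_tgt_ns_spdk ip link set lo up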
00:13:54 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:31:54.334 00:13:54 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:31:54.592 00:13:54 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:31:54.592 00:13:54 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:31:54.592 00:13:54 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:31:54.592 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:31:54.592 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.093 ms 00:31:54.592 00:31:54.592 --- 10.0.0.2 ping statistics --- 00:31:54.592 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:54.592 rtt min/avg/max/mdev = 0.093/0.093/0.093/0.000 ms 00:31:54.592 00:13:54 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:31:54.592 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:31:54.592 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:31:54.592 00:31:54.592 --- 10.0.0.3 ping statistics --- 00:31:54.593 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:54.593 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:31:54.593 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:31:54.593 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:31:54.593 00:31:54.593 --- 10.0.0.1 ping statistics --- 00:31:54.593 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:54.593 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@433 -- # return 0 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:31:54.593 00:13:54 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:31:54.593 00:13:55 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:31:54.593 00:13:55 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@720 -- # xtrace_disable 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:54.593 00:13:55 chaining -- nvmf/common.sh@481 -- # nvmfpid=577223 00:31:54.593 00:13:55 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:31:54.593 00:13:55 chaining -- nvmf/common.sh@482 -- # waitforlisten 577223 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@827 -- # '[' -z 577223 ']' 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:54.593 00:13:55 chaining -- 
common/autotest_common.sh@832 -- # local max_retries=100 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:54.593 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:54.593 00:13:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:54.593 [2024-05-15 00:13:55.105550] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:54.593 [2024-05-15 00:13:55.105632] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:54.851 [2024-05-15 00:13:55.233890] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:54.851 [2024-05-15 00:13:55.337508] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:54.851 [2024-05-15 00:13:55.337553] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:54.851 [2024-05-15 00:13:55.337567] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:54.851 [2024-05-15 00:13:55.337580] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:54.851 [2024-05-15 00:13:55.337590] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:31:54.851 [2024-05-15 00:13:55.337620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:55.785 00:13:56 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 00:13:56 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.VB5vGaIeUx 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.YxHMGaO1Eh 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 malloc0 00:31:55.785 true 00:31:55.785 true 00:31:55.785 [2024-05-15 00:13:56.136805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:55.785 crypto0 00:31:55.785 [2024-05-15 00:13:56.144831] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:55.785 crypto1 00:31:55.785 [2024-05-15 00:13:56.152938] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:55.785 [2024-05-15 00:13:56.168917] nvmf_rpc.c: 610:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype 
to be removed in v24.09 00:31:55.785 [2024-05-15 00:13:56.169212] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@85 -- # update_stats 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- 
# opcode=copy 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:55.785 00:13:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:55.785 00:13:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.VB5vGaIeUx bs=1K count=64 00:31:56.043 64+0 records in 00:31:56.043 64+0 records out 00:31:56.043 65536 bytes (66 kB, 64 KiB) copied, 0.00105431 s, 62.2 MB/s 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.VB5vGaIeUx --ob Nvme0n1 --bs 65536 --count 1 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@25 -- # local config 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:56.043 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:56.043 "subsystems": [ 00:31:56.043 { 00:31:56.043 "subsystem": "bdev", 00:31:56.043 "config": [ 00:31:56.043 { 00:31:56.043 "method": "bdev_nvme_attach_controller", 00:31:56.043 "params": { 00:31:56.043 "trtype": "tcp", 00:31:56.043 "adrfam": "IPv4", 00:31:56.043 "name": "Nvme0", 00:31:56.043 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:56.043 "traddr": "10.0.0.2", 00:31:56.043 "trsvcid": "4420" 00:31:56.043 } 00:31:56.043 }, 00:31:56.043 { 00:31:56.043 "method": "bdev_set_options", 00:31:56.043 "params": { 00:31:56.043 "bdev_auto_examine": false 00:31:56.043 } 00:31:56.043 } 00:31:56.043 ] 00:31:56.043 } 00:31:56.043 ] 00:31:56.043 }' 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.VB5vGaIeUx --ob Nvme0n1 --bs 65536 --count 1 00:31:56.043 00:13:56 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:56.043 "subsystems": [ 00:31:56.043 { 00:31:56.043 "subsystem": "bdev", 00:31:56.043 "config": [ 00:31:56.043 { 00:31:56.043 "method": "bdev_nvme_attach_controller", 00:31:56.043 "params": { 00:31:56.043 "trtype": "tcp", 00:31:56.043 "adrfam": "IPv4", 00:31:56.043 "name": "Nvme0", 00:31:56.043 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:56.043 "traddr": "10.0.0.2", 00:31:56.043 "trsvcid": "4420" 00:31:56.043 } 00:31:56.043 }, 00:31:56.043 { 00:31:56.043 "method": "bdev_set_options", 00:31:56.043 "params": { 00:31:56.043 "bdev_auto_examine": false 00:31:56.043 } 00:31:56.043 } 00:31:56.043 ] 00:31:56.043 } 00:31:56.043 ] 00:31:56.043 }' 00:31:56.043 [2024-05-15 00:13:56.493869] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
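This is the data-path pattern chaining.sh repeats for every step: gen_nvme.sh emits a JSON-config that attaches the remote Nvme0 controller over NVMe-oF/TCP, jq appends a bdev_set_options entry that disables auto-examine, and spdk_dd reads that config from a file descriptor while pushing the 64 KiB random file into the crypto-backed namespace as a single write. Restated as a stand-alone sketch (workspace paths shortened, process substitution standing in for the script's /dev/fd/62 redirection):

  # Attach-to-remote-target config, with bdev auto-examine switched off.
  config=$(scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
               --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
           jq '.subsystems[0].config[.subsystems[0].config | length] |=
               {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

  # One 64 KiB write of the random input file into the remote bdev Nvme0n1.
  build/bin/spdk_dd -c <(echo "$config") \
      --if /tmp/tmp.VB5vGaIeUx --ob Nvme0n1 --bs 65536 --count 1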
00:31:56.043 [2024-05-15 00:13:56.493931] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid577449 ] 00:31:56.043 [2024-05-15 00:13:56.623088] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:56.301 [2024-05-15 00:13:56.721058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:56.559  Copying: 64/64 [kB] (average 10 MBps) 00:31:56.559 00:31:56.816 00:13:57 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:31:56.817 00:13:57 chaining 
-- bdev/chaining.sh@95 -- # get_stat executed copy 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@96 -- # update_stats 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.817 00:13:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:56.817 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:57.075 00:13:57 chaining 
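Each data-path step is validated through the accel framework counters rather than just the dd exit code: the get_stat/update_stats helpers traced here call accel_get_stats over RPC and use jq to pull either the global sequence_executed count or a per-opcode executed count, then assert on the delta (for the 64 KiB write above, one extra sequence and two extra encrypt operations). Reduced to its essentials, with the rpc.py entry point assumed in place of the test framework's rpc_cmd wrapper:

  # Snapshot the counters the same way update_stats does.
  seq_before=$(scripts/rpc.py accel_get_stats | jq -r '.sequence_executed')
  enc_before=$(scripts/rpc.py accel_get_stats |
               jq -r '.operations[] | select(.opcode == "encrypt").executed')

  # ... run one spdk_dd write through the crypto chain here ...

  # The trace asserts deltas such as (( 13 == stats[sequence_executed] + 1 ))
  # and (( 2 == stats[encrypt_executed] + 2 )).
  seq_after=$(scripts/rpc.py accel_get_stats | jq -r '.sequence_executed')
  enc_after=$(scripts/rpc.py accel_get_stats |
              jq -r '.operations[] | select(.opcode == "encrypt").executed')
  (( seq_after == seq_before + 1 )) || echo "unexpected sequence_executed delta"
  (( enc_after == enc_before + 2 )) || echo "unexpected encrypt delta"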
-- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:57.075 00:13:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.YxHMGaO1Eh --ib Nvme0n1 --bs 65536 --count 1 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@25 -- # local config 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:57.075 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:57.075 "subsystems": [ 00:31:57.075 { 00:31:57.075 "subsystem": "bdev", 00:31:57.075 "config": [ 00:31:57.075 { 00:31:57.075 "method": "bdev_nvme_attach_controller", 00:31:57.075 "params": { 00:31:57.075 "trtype": "tcp", 00:31:57.075 "adrfam": "IPv4", 00:31:57.075 "name": "Nvme0", 00:31:57.075 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:57.075 "traddr": "10.0.0.2", 00:31:57.075 "trsvcid": "4420" 00:31:57.075 } 00:31:57.075 }, 00:31:57.075 { 00:31:57.075 "method": "bdev_set_options", 00:31:57.075 "params": { 00:31:57.075 "bdev_auto_examine": false 00:31:57.075 } 00:31:57.075 } 00:31:57.075 ] 00:31:57.075 } 00:31:57.075 ] 00:31:57.075 }' 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YxHMGaO1Eh --ib Nvme0n1 --bs 65536 --count 1 00:31:57.075 00:13:57 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:57.075 "subsystems": [ 00:31:57.075 { 00:31:57.075 "subsystem": "bdev", 00:31:57.075 "config": [ 00:31:57.075 { 00:31:57.075 "method": "bdev_nvme_attach_controller", 00:31:57.075 "params": { 00:31:57.075 "trtype": "tcp", 00:31:57.075 "adrfam": "IPv4", 00:31:57.075 "name": "Nvme0", 00:31:57.075 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:57.075 "traddr": "10.0.0.2", 00:31:57.075 "trsvcid": "4420" 00:31:57.075 } 00:31:57.075 }, 00:31:57.075 { 00:31:57.075 "method": "bdev_set_options", 00:31:57.075 "params": { 00:31:57.075 
"bdev_auto_examine": false 00:31:57.075 } 00:31:57.075 } 00:31:57.075 ] 00:31:57.075 } 00:31:57.075 ] 00:31:57.075 }' 00:31:57.075 [2024-05-15 00:13:57.644131] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:57.075 [2024-05-15 00:13:57.644199] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid577659 ] 00:31:57.333 [2024-05-15 00:13:57.774884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:57.333 [2024-05-15 00:13:57.874639] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:57.848  Copying: 64/64 [kB] (average 10 MBps) 00:31:57.848 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:57.848 00:13:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:57.848 00:13:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:58.106 00:13:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.106 00:13:58 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:31:58.106 00:13:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:58.106 00:13:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.106 00:13:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:58.106 00:13:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.VB5vGaIeUx /tmp/tmp.YxHMGaO1Eh 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@25 -- # local config 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:58.106 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:58.106 00:13:58 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:58.106 "subsystems": [ 00:31:58.106 { 00:31:58.106 "subsystem": "bdev", 00:31:58.106 "config": [ 00:31:58.106 { 00:31:58.106 "method": "bdev_nvme_attach_controller", 00:31:58.106 "params": { 00:31:58.106 "trtype": "tcp", 00:31:58.106 "adrfam": "IPv4", 00:31:58.106 "name": "Nvme0", 00:31:58.106 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:58.106 "traddr": "10.0.0.2", 00:31:58.106 "trsvcid": "4420" 00:31:58.106 } 00:31:58.107 }, 00:31:58.107 { 00:31:58.107 "method": "bdev_set_options", 00:31:58.107 "params": { 00:31:58.107 "bdev_auto_examine": false 00:31:58.107 } 00:31:58.107 } 00:31:58.107 ] 00:31:58.107 } 00:31:58.107 ] 00:31:58.107 }' 00:31:58.107 00:13:58 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:58.107 00:13:58 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:58.107 "subsystems": [ 00:31:58.107 { 00:31:58.107 "subsystem": "bdev", 00:31:58.107 "config": [ 00:31:58.107 { 00:31:58.107 "method": "bdev_nvme_attach_controller", 00:31:58.107 "params": { 00:31:58.107 "trtype": "tcp", 00:31:58.107 "adrfam": "IPv4", 00:31:58.107 "name": "Nvme0", 00:31:58.107 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:58.107 "traddr": "10.0.0.2", 00:31:58.107 "trsvcid": "4420" 00:31:58.107 } 00:31:58.107 }, 00:31:58.107 { 00:31:58.107 "method": "bdev_set_options", 00:31:58.107 "params": { 00:31:58.107 "bdev_auto_examine": false 00:31:58.107 } 00:31:58.107 } 00:31:58.107 ] 00:31:58.107 } 00:31:58.107 ] 00:31:58.107 }' 
00:31:58.107 [2024-05-15 00:13:58.643241] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:31:58.107 [2024-05-15 00:13:58.643303] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid577785 ] 00:31:58.364 [2024-05-15 00:13:58.772418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:58.365 [2024-05-15 00:13:58.869266] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:58.881  Copying: 64/64 [kB] (average 20 MBps) 00:31:58.881 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@106 -- # update_stats 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.881 00:13:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:58.881 00:13:59 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.881 00:13:59 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:59.139 00:13:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.139 00:13:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:59.139 00:13:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.VB5vGaIeUx --ob Nvme0n1 --bs 4096 --count 16 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@25 -- # local config 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:59.139 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:59.139 "subsystems": [ 00:31:59.139 { 00:31:59.139 "subsystem": "bdev", 00:31:59.139 "config": [ 00:31:59.139 { 00:31:59.139 "method": "bdev_nvme_attach_controller", 00:31:59.139 "params": { 00:31:59.139 "trtype": "tcp", 00:31:59.139 "adrfam": "IPv4", 00:31:59.139 "name": "Nvme0", 00:31:59.139 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:59.139 "traddr": "10.0.0.2", 00:31:59.139 "trsvcid": "4420" 00:31:59.139 } 00:31:59.139 }, 00:31:59.139 { 00:31:59.139 "method": "bdev_set_options", 00:31:59.139 "params": { 00:31:59.139 "bdev_auto_examine": false 00:31:59.139 } 00:31:59.139 } 00:31:59.139 ] 00:31:59.139 } 00:31:59.139 ] 00:31:59.139 }' 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.VB5vGaIeUx --ob Nvme0n1 --bs 4096 --count 16 00:31:59.139 00:13:59 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:59.139 "subsystems": [ 00:31:59.139 { 00:31:59.139 "subsystem": "bdev", 00:31:59.139 "config": [ 00:31:59.139 { 00:31:59.139 "method": "bdev_nvme_attach_controller", 00:31:59.139 "params": { 00:31:59.139 "trtype": "tcp", 00:31:59.139 "adrfam": "IPv4", 00:31:59.139 "name": "Nvme0", 00:31:59.139 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:59.139 "traddr": "10.0.0.2", 00:31:59.139 "trsvcid": "4420" 00:31:59.139 } 00:31:59.139 }, 00:31:59.139 { 00:31:59.139 "method": "bdev_set_options", 00:31:59.139 "params": { 00:31:59.139 "bdev_auto_examine": false 00:31:59.139 } 00:31:59.139 } 00:31:59.139 ] 00:31:59.139 } 00:31:59.139 ] 00:31:59.139 }' 00:31:59.139 [2024-05-15 00:13:59.614994] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:31:59.139 [2024-05-15 00:13:59.615064] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid577894 ] 00:31:59.397 [2024-05-15 00:13:59.743681] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:59.397 [2024-05-15 00:13:59.842333] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:59.913  Copying: 64/64 [kB] (average 8000 kBps) 00:31:59.913 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:31:59.913 
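(Editor's note) The checks at chaining.sh@110-113 above compare accel framework counters taken before and after each dd pass: a 16-block (4 KiB) write through the two-stage crypto chain is expected to add 16 executed sequences and 32 encrypt operations while leaving decrypt and copy counts unchanged. A condensed sketch of that get_stat/delta pattern is shown below; the helper internals are reconstructed from the trace, and rpc_cmd stands for the test framework's scripts/rpc.py wrapper.

```bash
# Hedged sketch of the stat-delta checks traced above; the stats[] bookkeeping
# mirrors the trace, the helper body itself is an assumption.
declare -A stats

get_stat() {            # usage: get_stat <event> [opcode] -> prints one counter
    local event=$1 opcode=${2:-}
    if [[ -z $opcode ]]; then
        # Top-level counter, e.g. .sequence_executed
        rpc_cmd accel_get_stats | jq -r ".$event"
    else
        # Per-opcode counter, e.g. operations[] with opcode "encrypt"
        rpc_cmd accel_get_stats \
            | jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
    fi
}

# Snapshot before an I/O pass ...
stats[sequence_executed]=$(get_stat sequence_executed)
stats[encrypt_executed]=$(get_stat executed encrypt)

# ... run one spdk_dd pass, then assert the expected deltas
# (+16 sequences, +32 encrypt ops for 16 blocks through two crypto stages).
(( $(get_stat sequence_executed) == stats[sequence_executed] + 16 ))
(( $(get_stat executed encrypt)  == stats[encrypt_executed] + 32 ))
```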
00:14:00 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:59.913 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:59.913 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@114 -- # update_stats 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:00.171 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:00.171 00:14:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:00.172 00:14:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:00.172 00:14:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:00.172 00:14:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@117 -- # : 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.YxHMGaO1Eh --ib Nvme0n1 --bs 4096 --count 16 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@25 -- # local config 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:00.172 00:14:00 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:00.172 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:00.429 00:14:00 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:00.429 "subsystems": [ 00:32:00.429 { 00:32:00.429 "subsystem": "bdev", 00:32:00.429 "config": [ 00:32:00.429 { 00:32:00.429 "method": "bdev_nvme_attach_controller", 00:32:00.429 "params": { 00:32:00.429 "trtype": "tcp", 00:32:00.429 "adrfam": "IPv4", 00:32:00.429 "name": "Nvme0", 00:32:00.429 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:00.429 "traddr": "10.0.0.2", 00:32:00.430 "trsvcid": "4420" 00:32:00.430 } 00:32:00.430 }, 00:32:00.430 { 00:32:00.430 "method": "bdev_set_options", 00:32:00.430 "params": { 00:32:00.430 "bdev_auto_examine": false 00:32:00.430 } 00:32:00.430 } 00:32:00.430 ] 00:32:00.430 } 00:32:00.430 ] 00:32:00.430 }' 00:32:00.430 00:14:00 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YxHMGaO1Eh --ib Nvme0n1 --bs 4096 --count 16 00:32:00.430 00:14:00 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:00.430 "subsystems": [ 00:32:00.430 { 00:32:00.430 "subsystem": "bdev", 00:32:00.430 "config": [ 00:32:00.430 { 00:32:00.430 "method": "bdev_nvme_attach_controller", 00:32:00.430 "params": { 00:32:00.430 "trtype": "tcp", 00:32:00.430 "adrfam": "IPv4", 00:32:00.430 "name": "Nvme0", 00:32:00.430 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:00.430 "traddr": "10.0.0.2", 00:32:00.430 "trsvcid": "4420" 00:32:00.430 } 00:32:00.430 }, 00:32:00.430 { 
00:32:00.430 "method": "bdev_set_options", 00:32:00.430 "params": { 00:32:00.430 "bdev_auto_examine": false 00:32:00.430 } 00:32:00.430 } 00:32:00.430 ] 00:32:00.430 } 00:32:00.430 ] 00:32:00.430 }' 00:32:00.430 [2024-05-15 00:14:00.827994] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:32:00.430 [2024-05-15 00:14:00.828057] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid578102 ] 00:32:00.430 [2024-05-15 00:14:00.954847] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:00.687 [2024-05-15 00:14:01.052515] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:32:01.253  Copying: 64/64 [kB] (average 1361 kBps) 00:32:01.253 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:01.253 00:14:01 chaining -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:01.253 00:14:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.VB5vGaIeUx /tmp/tmp.YxHMGaO1Eh 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.VB5vGaIeUx /tmp/tmp.YxHMGaO1Eh 00:32:01.253 00:14:01 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@117 -- # sync 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@120 -- # set +e 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:01.253 00:14:01 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:01.253 rmmod nvme_tcp 00:32:01.253 rmmod nvme_fabrics 00:32:01.253 rmmod nvme_keyring 00:32:01.512 00:14:01 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:01.512 00:14:01 chaining -- nvmf/common.sh@124 -- # set -e 00:32:01.512 00:14:01 chaining -- nvmf/common.sh@125 -- # return 0 00:32:01.512 00:14:01 chaining -- nvmf/common.sh@489 -- # '[' -n 577223 ']' 00:32:01.512 00:14:01 chaining -- nvmf/common.sh@490 -- # killprocess 577223 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@946 -- # '[' -z 577223 ']' 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@950 -- # kill -0 577223 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@951 -- # uname 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 577223 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 577223' 00:32:01.512 killing process with pid 577223 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@965 -- # kill 577223 00:32:01.512 [2024-05-15 00:14:01.917753] 
app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:32:01.512 00:14:01 chaining -- common/autotest_common.sh@970 -- # wait 577223 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:01.771 00:14:02 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:32:01.771 00:14:02 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:32:01.771 00:14:02 chaining -- bdev/chaining.sh@132 -- # bperfpid=578316 00:32:01.771 00:14:02 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:01.771 00:14:02 chaining -- bdev/chaining.sh@134 -- # waitforlisten 578316 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@827 -- # '[' -z 578316 ']' 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:01.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:32:01.771 00:14:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:01.771 [2024-05-15 00:14:02.302219] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:32:01.771 [2024-05-15 00:14:02.302295] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid578316 ] 00:32:02.028 [2024-05-15 00:14:02.429175] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:02.028 [2024-05-15 00:14:02.532342] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:32:02.961 00:14:03 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:32:02.961 00:14:03 chaining -- common/autotest_common.sh@860 -- # return 0 00:32:02.961 00:14:03 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:32:02.962 00:14:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:02.962 00:14:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:02.962 malloc0 00:32:02.962 true 00:32:02.962 true 00:32:02.962 [2024-05-15 00:14:03.373045] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:02.962 crypto0 00:32:02.962 [2024-05-15 00:14:03.381070] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:32:02.962 crypto1 00:32:02.962 00:14:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:02.962 00:14:03 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:02.962 Running I/O for 5 seconds... 00:32:08.235 00:32:08.235 Latency(us) 00:32:08.235 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:08.235 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:08.235 Verification LBA range: start 0x0 length 0x2000 00:32:08.235 crypto1 : 5.01 11387.82 44.48 0.00 0.00 22417.68 6753.06 14474.91 00:32:08.235 =================================================================================================================== 00:32:08.235 Total : 11387.82 44.48 0.00 0.00 22417.68 6753.06 14474.91 00:32:08.235 0 00:32:08.235 00:14:08 chaining -- bdev/chaining.sh@146 -- # killprocess 578316 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@946 -- # '[' -z 578316 ']' 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@950 -- # kill -0 578316 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@951 -- # uname 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 578316 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:32:08.235 00:14:08 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:32:08.236 00:14:08 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 578316' 00:32:08.236 killing process with pid 578316 00:32:08.236 00:14:08 chaining -- common/autotest_common.sh@965 -- # kill 578316 00:32:08.236 Received shutdown signal, test time was about 5.000000 seconds 00:32:08.236 00:32:08.236 Latency(us) 00:32:08.236 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:08.236 =================================================================================================================== 00:32:08.236 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:08.236 00:14:08 chaining -- common/autotest_common.sh@970 -- # wait 578316 00:32:08.495 00:14:08 chaining -- bdev/chaining.sh@152 -- # bperfpid=579195 00:32:08.495 00:14:08 
chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:08.495 00:14:08 chaining -- bdev/chaining.sh@154 -- # waitforlisten 579195 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@827 -- # '[' -z 579195 ']' 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:08.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:32:08.495 00:14:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.495 [2024-05-15 00:14:08.908296] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:32:08.495 [2024-05-15 00:14:08.908376] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid579195 ] 00:32:08.495 [2024-05-15 00:14:09.038214] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:08.754 [2024-05-15 00:14:09.143972] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:32:09.322 00:14:09 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:32:09.322 00:14:09 chaining -- common/autotest_common.sh@860 -- # return 0 00:32:09.323 00:14:09 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:32:09.323 00:14:09 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:09.323 00:14:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:09.581 malloc0 00:32:09.582 true 00:32:09.582 true 00:32:09.582 [2024-05-15 00:14:09.990512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:32:09.582 [2024-05-15 00:14:09.990562] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:09.582 [2024-05-15 00:14:09.990584] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24808a0 00:32:09.582 [2024-05-15 00:14:09.990597] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:09.582 [2024-05-15 00:14:09.991705] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:09.582 [2024-05-15 00:14:09.991732] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:32:09.582 pt0 00:32:09.582 [2024-05-15 00:14:09.998545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:09.582 crypto0 00:32:09.582 [2024-05-15 00:14:10.006664] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:32:09.582 crypto1 00:32:09.582 00:14:10 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:09.582 00:14:10 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:09.582 Running I/O for 5 seconds... 
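(Editor's note) The RPC notices above show the bdev stack bdevperf drives in this second run: malloc0 claimed by the passthru vbdev pt0, then two crypto vbdevs created from key0 and key1, with the verify job running against crypto1. For reference, an equivalent bdev-subsystem config is sketched below; the layering, sizes and key names are placeholders consistent with the notices (the actual keys come from accel_crypto_key_create, which is not shown), not the test's exact values.

```bash
# Sketch only: one plausible config for the malloc0 -> pt0 -> crypto0 -> crypto1
# stack suggested by the vbdev_passthru/vbdev_crypto notices above.
cat <<'EOF' > /tmp/chain_stack.json
{ "subsystems": [ { "subsystem": "bdev", "config": [
  { "method": "bdev_malloc_create",
    "params": { "name": "malloc0", "num_blocks": 16384, "block_size": 4096 } },
  { "method": "bdev_passthru_create",
    "params": { "base_bdev_name": "malloc0", "name": "pt0" } },
  { "method": "bdev_crypto_create",
    "params": { "base_bdev_name": "pt0", "name": "crypto0", "key_name": "key0" } },
  { "method": "bdev_crypto_create",
    "params": { "base_bdev_name": "crypto0", "name": "crypto1", "key_name": "key1" } }
] } ] }
EOF
```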
00:32:14.857 00:32:14.857 Latency(us) 00:32:14.857 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:14.857 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:14.857 Verification LBA range: start 0x0 length 0x2000 00:32:14.857 crypto1 : 5.01 9053.86 35.37 0.00 0.00 28183.32 3433.52 16982.37 00:32:14.857 =================================================================================================================== 00:32:14.857 Total : 9053.86 35.37 0.00 0.00 28183.32 3433.52 16982.37 00:32:14.857 0 00:32:14.857 00:14:15 chaining -- bdev/chaining.sh@167 -- # killprocess 579195 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@946 -- # '[' -z 579195 ']' 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@950 -- # kill -0 579195 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@951 -- # uname 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 579195 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 579195' 00:32:14.857 killing process with pid 579195 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@965 -- # kill 579195 00:32:14.857 Received shutdown signal, test time was about 5.000000 seconds 00:32:14.857 00:32:14.857 Latency(us) 00:32:14.857 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:14.857 =================================================================================================================== 00:32:14.857 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:14.857 00:14:15 chaining -- common/autotest_common.sh@970 -- # wait 579195 00:32:15.150 00:14:15 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:32:15.150 00:14:15 chaining -- bdev/chaining.sh@170 -- # killprocess 579195 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@946 -- # '[' -z 579195 ']' 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@950 -- # kill -0 579195 00:32:15.150 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 950: kill: (579195) - No such process 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@973 -- # echo 'Process with pid 579195 is not found' 00:32:15.150 Process with pid 579195 is not found 00:32:15.150 00:14:15 chaining -- bdev/chaining.sh@171 -- # wait 579195 00:32:15.150 00:14:15 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:32:15.150 00:14:15 chaining -- 
nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:32:15.150 00:14:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:15.150 00:14:15 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@296 -- # e810=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@297 -- # x722=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@298 -- # mlx=() 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@336 -- # return 1 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:32:15.151 WARNING: No supported devices were found, fallback requested for tcp test 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:32:15.151 00:14:15 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:32:15.151 Cannot find device "nvmf_tgt_br" 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@155 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:32:15.151 Cannot find device "nvmf_tgt_br2" 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@156 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:32:15.151 Cannot find device "nvmf_tgt_br" 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@158 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:32:15.151 Cannot find device "nvmf_tgt_br2" 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@159 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:32:15.151 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@162 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:32:15.151 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@163 -- # true 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:32:15.151 00:14:15 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:32:15.410 00:14:15 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:32:15.410 00:14:15 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:32:15.410 00:14:15 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:32:15.410 00:14:15 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:32:15.411 00:14:15 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:32:15.670 00:14:16 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:32:15.670 00:14:16 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:32:15.670 00:14:16 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:32:15.670 00:14:16 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:32:15.670 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:32:15.670 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.125 ms 00:32:15.670 00:32:15.670 --- 10.0.0.2 ping statistics --- 00:32:15.670 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:15.670 rtt min/avg/max/mdev = 0.125/0.125/0.125/0.000 ms 00:32:15.670 00:14:16 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:32:15.670 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:32:15.670 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.067 ms 00:32:15.670 00:32:15.670 --- 10.0.0.3 ping statistics --- 00:32:15.670 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:15.671 rtt min/avg/max/mdev = 0.067/0.067/0.067/0.000 ms 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:32:15.671 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:32:15.671 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.039 ms 00:32:15.671 00:32:15.671 --- 10.0.0.1 ping statistics --- 00:32:15.671 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:15.671 rtt min/avg/max/mdev = 0.039/0.039/0.039/0.000 ms 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@433 -- # return 0 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:32:15.671 00:14:16 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:32:15.929 00:14:16 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:32:15.929 00:14:16 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:32:15.929 00:14:16 chaining -- common/autotest_common.sh@720 -- # xtrace_disable 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:15.930 00:14:16 chaining -- nvmf/common.sh@481 -- # nvmfpid=580342 00:32:15.930 00:14:16 chaining -- nvmf/common.sh@482 -- # waitforlisten 580342 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@827 -- # '[' -z 580342 ']' 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:32:15.930 00:14:16 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:15.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:32:15.930 00:14:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:15.930 [2024-05-15 00:14:16.359553] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:32:15.930 [2024-05-15 00:14:16.359628] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:15.930 [2024-05-15 00:14:16.488408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:16.189 [2024-05-15 00:14:16.594134] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:32:16.189 [2024-05-15 00:14:16.594178] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:32:16.189 [2024-05-15 00:14:16.594192] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:32:16.189 [2024-05-15 00:14:16.594205] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:32:16.189 [2024-05-15 00:14:16.594216] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:32:16.189 [2024-05-15 00:14:16.594252] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@860 -- # return 0 00:32:16.756 00:14:17 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:16.756 00:14:17 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:32:16.756 00:14:17 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:16.756 00:14:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:16.756 malloc0 00:32:16.756 [2024-05-15 00:14:17.328762] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:32:16.756 [2024-05-15 00:14:17.344731] nvmf_rpc.c: 610:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype to be removed in v24.09 00:32:16.756 [2024-05-15 00:14:17.345009] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:17.016 00:14:17 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:32:17.016 00:14:17 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:17.016 00:14:17 chaining -- bdev/chaining.sh@189 -- # bperfpid=580535 00:32:17.016 00:14:17 chaining -- bdev/chaining.sh@191 -- # waitforlisten 580535 /var/tmp/bperf.sock 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@827 -- # '[' -z 580535 ']' 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:32:17.016 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:32:17.016 00:14:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:17.016 [2024-05-15 00:14:17.391916] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 
00:32:17.016 [2024-05-15 00:14:17.391960] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580535 ] 00:32:17.016 [2024-05-15 00:14:17.503258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:17.016 [2024-05-15 00:14:17.600801] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:32:17.951 00:14:18 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:32:17.951 00:14:18 chaining -- common/autotest_common.sh@860 -- # return 0 00:32:17.951 00:14:18 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:32:17.951 00:14:18 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:32:18.209 [2024-05-15 00:14:18.678954] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:18.209 nvme0n1 00:32:18.209 true 00:32:18.209 crypto0 00:32:18.209 00:14:18 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:32:18.467 Running I/O for 5 seconds... 00:32:23.739 00:32:23.739 Latency(us) 00:32:23.739 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:23.739 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:23.739 Verification LBA range: start 0x0 length 0x2000 00:32:23.740 crypto0 : 5.02 8284.75 32.36 0.00 0.00 30801.81 5271.37 24276.81 00:32:23.740 =================================================================================================================== 00:32:23.740 Total : 8284.75 32.36 0.00 0.00 30801.81 5271.37 24276.81 00:32:23.740 0 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:23.740 00:14:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@205 -- # sequence=83196 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf 
accel_get_stats 00:32:23.740 00:14:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@206 -- # encrypt=41598 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:23.999 00:14:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@207 -- # decrypt=41598 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:32:24.258 00:14:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:24.517 00:14:24 chaining -- bdev/chaining.sh@208 -- # crc32c=83196 00:32:24.517 00:14:24 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:32:24.517 00:14:24 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:32:24.517 00:14:24 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:32:24.517 00:14:24 chaining -- bdev/chaining.sh@214 -- # killprocess 580535 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@946 -- # '[' -z 580535 ']' 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@950 -- # kill -0 580535 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@951 -- # uname 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 580535 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 580535' 00:32:24.517 killing process with pid 580535 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@965 -- # kill 580535 00:32:24.517 Received shutdown signal, test time was about 5.000000 seconds 00:32:24.517 
00:32:24.517 Latency(us) 00:32:24.517 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.517 =================================================================================================================== 00:32:24.517 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:24.517 00:14:24 chaining -- common/autotest_common.sh@970 -- # wait 580535 00:32:24.776 00:14:25 chaining -- bdev/chaining.sh@219 -- # bperfpid=581599 00:32:24.776 00:14:25 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:32:24.776 00:14:25 chaining -- bdev/chaining.sh@221 -- # waitforlisten 581599 /var/tmp/bperf.sock 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@827 -- # '[' -z 581599 ']' 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:32:24.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:32:24.776 00:14:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:24.776 [2024-05-15 00:14:25.245754] Starting SPDK v24.05-pre git sha1 52939f252 / DPDK 23.11.0 initialization... 00:32:24.776 [2024-05-15 00:14:25.245829] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid581599 ] 00:32:25.035 [2024-05-15 00:14:25.377072] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:25.035 [2024-05-15 00:14:25.482275] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:32:25.603 00:14:26 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:32:25.603 00:14:26 chaining -- common/autotest_common.sh@860 -- # return 0 00:32:25.603 00:14:26 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:32:25.603 00:14:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:32:25.863 [2024-05-15 00:14:26.441911] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:25.863 nvme0n1 00:32:25.863 true 00:32:25.863 crypto0 00:32:26.123 00:14:26 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:32:26.123 Running I/O for 5 seconds... 
00:32:31.399 00:32:31.399 Latency(us) 00:32:31.399 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:31.399 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:32:31.399 Verification LBA range: start 0x0 length 0x200 00:32:31.399 crypto0 : 5.01 1678.82 104.93 0.00 0.00 18681.89 2151.29 18919.96 00:32:31.399 =================================================================================================================== 00:32:31.399 Total : 1678.82 104.93 0.00 0.00 18681.89 2151.29 18919.96 00:32:31.399 0 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@233 -- # sequence=16812 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:31.399 00:14:31 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@234 -- # encrypt=8406 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:31.658 00:14:32 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@235 -- # decrypt=8406 
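The stat checks before and after this point all follow one pattern from bdev/chaining.sh: the get_stat/get_stat_bperf helpers query the bdevperf RPC socket with accel_get_stats and pick a single counter out of the JSON with jq. Below is a minimal standalone sketch of that pattern, assuming an SPDK checkout at $SPDK_DIR and a bdevperf instance already serving RPCs on /var/tmp/bperf.sock; the rpc_bperf name, the jq filters, and the three arithmetic invariants are the ones visible in the trace, everything else (the wrapper script itself, the defaulted $SPDK_DIR) is illustrative only.

#!/usr/bin/env bash
# Sketch of the accel stat collection pattern seen in the trace.
# Assumptions: $SPDK_DIR points at an SPDK tree, and a bdevperf instance is
# already listening on /var/tmp/bperf.sock with its I/O run finished.
set -e
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
BPERF_SOCK=/var/tmp/bperf.sock

rpc_bperf() {
    "$SPDK_DIR/scripts/rpc.py" -s "$BPERF_SOCK" "$@"
}

# Top-level counter: how many accel sequences completed.
sequence=$(rpc_bperf accel_get_stats | jq -r .sequence_executed)

# Per-opcode counters, filtered the same way the trace's jq expressions do.
encrypt=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed')
decrypt=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed')
crc32c=$(rpc_bperf accel_get_stats | jq -r '.operations[] | select(.opcode == "crc32c").executed')

# The invariants the test asserts: every completed sequence is exactly one
# encrypt or one decrypt, and each of those sequences also ran a crc32c step.
# Under set -e a false arithmetic test aborts the script, acting as an assert.
(( sequence > 0 ))
(( encrypt + decrypt == sequence ))
(( encrypt + decrypt == crc32c ))
echo "sequence=$sequence encrypt=$encrypt decrypt=$decrypt crc32c=$crc32c"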
00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:32:31.917 00:14:32 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:32.177 00:14:32 chaining -- bdev/chaining.sh@236 -- # crc32c=16812 00:32:32.177 00:14:32 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:32:32.177 00:14:32 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:32:32.177 00:14:32 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:32:32.177 00:14:32 chaining -- bdev/chaining.sh@242 -- # killprocess 581599 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@946 -- # '[' -z 581599 ']' 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@950 -- # kill -0 581599 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@951 -- # uname 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 581599 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 581599' 00:32:32.177 killing process with pid 581599 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@965 -- # kill 581599 00:32:32.177 Received shutdown signal, test time was about 5.000000 seconds 00:32:32.177 00:32:32.177 Latency(us) 00:32:32.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:32.177 =================================================================================================================== 00:32:32.177 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:32.177 00:14:32 chaining -- common/autotest_common.sh@970 -- # wait 581599 00:32:32.437 00:14:32 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@117 -- # sync 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@120 -- # set +e 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:32.437 rmmod nvme_tcp 00:32:32.437 rmmod nvme_fabrics 00:32:32.437 rmmod nvme_keyring 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@124 -- # set -e 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@125 -- # return 0 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@489 -- # '[' -n 
580342 ']' 00:32:32.437 00:14:32 chaining -- nvmf/common.sh@490 -- # killprocess 580342 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@946 -- # '[' -z 580342 ']' 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@950 -- # kill -0 580342 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@951 -- # uname 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 580342 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 580342' 00:32:32.437 killing process with pid 580342 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@965 -- # kill 580342 00:32:32.437 [2024-05-15 00:14:33.000707] app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:32:32.437 00:14:32 chaining -- common/autotest_common.sh@970 -- # wait 580342 00:32:32.696 00:14:33 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:32.696 00:14:33 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:32.697 00:14:33 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:32.697 00:14:33 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:32.697 00:14:33 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:32.697 00:14:33 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:32.697 00:14:33 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:32.697 00:14:33 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:32.956 00:14:33 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:32:32.956 00:14:33 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:32:32.956 00:32:32.956 real 0m46.708s 00:32:32.956 user 0m59.936s 00:32:32.956 sys 0m13.490s 00:32:32.956 00:14:33 chaining -- common/autotest_common.sh@1122 -- # xtrace_disable 00:32:32.956 00:14:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:32.956 ************************************ 00:32:32.956 END TEST chaining 00:32:32.956 ************************************ 00:32:32.956 00:14:33 -- spdk/autotest.sh@359 -- # [[ 0 -eq 1 ]] 00:32:32.956 00:14:33 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:32:32.956 00:14:33 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:32:32.956 00:14:33 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:32:32.956 00:14:33 -- spdk/autotest.sh@376 -- # trap - SIGINT SIGTERM EXIT 00:32:32.956 00:14:33 -- spdk/autotest.sh@378 -- # timing_enter post_cleanup 00:32:32.956 00:14:33 -- common/autotest_common.sh@720 -- # xtrace_disable 00:32:32.956 00:14:33 -- common/autotest_common.sh@10 -- # set +x 00:32:32.956 00:14:33 -- spdk/autotest.sh@379 -- # autotest_cleanup 00:32:32.956 00:14:33 -- common/autotest_common.sh@1388 -- # local autotest_es=0 00:32:32.956 00:14:33 -- common/autotest_common.sh@1389 -- # xtrace_disable 00:32:32.956 00:14:33 -- common/autotest_common.sh@10 -- # set +x 00:32:37.212 INFO: APP EXITING 00:32:37.212 INFO: killing all VMs 00:32:37.212 INFO: killing vhost app 00:32:37.212 INFO: EXIT DONE 00:32:40.501 0000:d7:05.5 (8086 201d): Skipping not allowed VMD 
controller at 0000:d7:05.5 00:32:40.501 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:32:40.501 Waiting for block devices as requested 00:32:40.501 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:32:40.501 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:32:40.760 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:32:40.760 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:32:40.760 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:32:41.020 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:32:41.020 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:32:41.020 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:32:41.279 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:32:41.279 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:32:41.279 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:32:41.538 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:32:41.538 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:32:41.538 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:32:41.797 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:32:41.797 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:32:41.797 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:32:45.990 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:32:45.990 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:32:45.990 Cleaning 00:32:45.990 Removing: /var/run/dpdk/spdk0/config 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:32:45.990 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:32:45.990 Removing: /var/run/dpdk/spdk0/hugepage_info 00:32:45.990 Removing: /dev/shm/nvmf_trace.0 00:32:45.990 Removing: /dev/shm/spdk_tgt_trace.pid345631 00:32:45.990 Removing: /var/run/dpdk/spdk0 00:32:45.990 Removing: /var/run/dpdk/spdk_pid344775 00:32:45.990 Removing: /var/run/dpdk/spdk_pid345631 00:32:45.990 Removing: /var/run/dpdk/spdk_pid346376 00:32:45.990 Removing: /var/run/dpdk/spdk_pid347554 00:32:45.990 Removing: /var/run/dpdk/spdk_pid347746 00:32:45.990 Removing: /var/run/dpdk/spdk_pid348500 00:32:45.990 Removing: /var/run/dpdk/spdk_pid348677 00:32:45.990 Removing: /var/run/dpdk/spdk_pid348959 00:32:45.990 Removing: /var/run/dpdk/spdk_pid351522 00:32:45.990 Removing: /var/run/dpdk/spdk_pid352755 00:32:45.990 Removing: /var/run/dpdk/spdk_pid353120 00:32:45.990 Removing: /var/run/dpdk/spdk_pid353382 00:32:45.990 Removing: /var/run/dpdk/spdk_pid353687 00:32:45.990 Removing: /var/run/dpdk/spdk_pid354040 00:32:45.990 Removing: /var/run/dpdk/spdk_pid354236 00:32:45.990 Removing: /var/run/dpdk/spdk_pid354440 00:32:45.990 Removing: /var/run/dpdk/spdk_pid354664 00:32:45.990 Removing: /var/run/dpdk/spdk_pid355415 00:32:45.990 Removing: /var/run/dpdk/spdk_pid358111 00:32:45.990 Removing: /var/run/dpdk/spdk_pid358310 00:32:45.990 Removing: /var/run/dpdk/spdk_pid358551 00:32:45.990 Removing: /var/run/dpdk/spdk_pid358839 00:32:45.990 Removing: /var/run/dpdk/spdk_pid358954 00:32:45.990 Removing: /var/run/dpdk/spdk_pid359148 00:32:45.990 Removing: /var/run/dpdk/spdk_pid359375 00:32:45.990 Removing: 
/var/run/dpdk/spdk_pid359579 00:32:45.990 Removing: /var/run/dpdk/spdk_pid359773 00:32:45.990 Removing: /var/run/dpdk/spdk_pid359973 00:32:45.990 Removing: /var/run/dpdk/spdk_pid360252 00:32:45.990 Removing: /var/run/dpdk/spdk_pid360529 00:32:45.990 Removing: /var/run/dpdk/spdk_pid360729 00:32:45.990 Removing: /var/run/dpdk/spdk_pid360923 00:32:45.990 Removing: /var/run/dpdk/spdk_pid361122 00:32:45.990 Removing: /var/run/dpdk/spdk_pid361327 00:32:45.990 Removing: /var/run/dpdk/spdk_pid361620 00:32:45.990 Removing: /var/run/dpdk/spdk_pid361875 00:32:45.990 Removing: /var/run/dpdk/spdk_pid362073 00:32:45.990 Removing: /var/run/dpdk/spdk_pid362272 00:32:45.990 Removing: /var/run/dpdk/spdk_pid362466 00:32:45.990 Removing: /var/run/dpdk/spdk_pid362736 00:32:45.990 Removing: /var/run/dpdk/spdk_pid363024 00:32:45.990 Removing: /var/run/dpdk/spdk_pid363226 00:32:45.990 Removing: /var/run/dpdk/spdk_pid363420 00:32:45.990 Removing: /var/run/dpdk/spdk_pid363616 00:32:45.990 Removing: /var/run/dpdk/spdk_pid363981 00:32:45.990 Removing: /var/run/dpdk/spdk_pid364304 00:32:45.990 Removing: /var/run/dpdk/spdk_pid364556 00:32:46.250 Removing: /var/run/dpdk/spdk_pid364920 00:32:46.250 Removing: /var/run/dpdk/spdk_pid365288 00:32:46.250 Removing: /var/run/dpdk/spdk_pid365571 00:32:46.250 Removing: /var/run/dpdk/spdk_pid365861 00:32:46.250 Removing: /var/run/dpdk/spdk_pid366226 00:32:46.250 Removing: /var/run/dpdk/spdk_pid366435 00:32:46.250 Removing: /var/run/dpdk/spdk_pid366719 00:32:46.250 Removing: /var/run/dpdk/spdk_pid367187 00:32:46.250 Removing: /var/run/dpdk/spdk_pid367566 00:32:46.250 Removing: /var/run/dpdk/spdk_pid367737 00:32:46.250 Removing: /var/run/dpdk/spdk_pid371727 00:32:46.250 Removing: /var/run/dpdk/spdk_pid373555 00:32:46.250 Removing: /var/run/dpdk/spdk_pid375479 00:32:46.250 Removing: /var/run/dpdk/spdk_pid376370 00:32:46.250 Removing: /var/run/dpdk/spdk_pid377452 00:32:46.250 Removing: /var/run/dpdk/spdk_pid377811 00:32:46.250 Removing: /var/run/dpdk/spdk_pid377918 00:32:46.250 Removing: /var/run/dpdk/spdk_pid378024 00:32:46.250 Removing: /var/run/dpdk/spdk_pid381818 00:32:46.250 Removing: /var/run/dpdk/spdk_pid382329 00:32:46.250 Removing: /var/run/dpdk/spdk_pid383266 00:32:46.250 Removing: /var/run/dpdk/spdk_pid383466 00:32:46.250 Removing: /var/run/dpdk/spdk_pid388617 00:32:46.250 Removing: /var/run/dpdk/spdk_pid393341 00:32:46.250 Removing: /var/run/dpdk/spdk_pid398060 00:32:46.250 Removing: /var/run/dpdk/spdk_pid409047 00:32:46.250 Removing: /var/run/dpdk/spdk_pid419690 00:32:46.250 Removing: /var/run/dpdk/spdk_pid430828 00:32:46.250 Removing: /var/run/dpdk/spdk_pid443895 00:32:46.250 Removing: /var/run/dpdk/spdk_pid456468 00:32:46.250 Removing: /var/run/dpdk/spdk_pid468447 00:32:46.250 Removing: /var/run/dpdk/spdk_pid472414 00:32:46.250 Removing: /var/run/dpdk/spdk_pid475471 00:32:46.250 Removing: /var/run/dpdk/spdk_pid480686 00:32:46.250 Removing: /var/run/dpdk/spdk_pid483793 00:32:46.250 Removing: /var/run/dpdk/spdk_pid488378 00:32:46.250 Removing: /var/run/dpdk/spdk_pid491604 00:32:46.250 Removing: /var/run/dpdk/spdk_pid497160 00:32:46.250 Removing: /var/run/dpdk/spdk_pid499869 00:32:46.250 Removing: /var/run/dpdk/spdk_pid506357 00:32:46.250 Removing: /var/run/dpdk/spdk_pid508441 00:32:46.250 Removing: /var/run/dpdk/spdk_pid514866 00:32:46.250 Removing: /var/run/dpdk/spdk_pid516992 00:32:46.250 Removing: /var/run/dpdk/spdk_pid523140 00:32:46.250 Removing: /var/run/dpdk/spdk_pid525078 00:32:46.250 Removing: /var/run/dpdk/spdk_pid529398 00:32:46.250 Removing: 
/var/run/dpdk/spdk_pid529748 00:32:46.250 Removing: /var/run/dpdk/spdk_pid530103 00:32:46.250 Removing: /var/run/dpdk/spdk_pid530463 00:32:46.250 Removing: /var/run/dpdk/spdk_pid530898 00:32:46.250 Removing: /var/run/dpdk/spdk_pid531680 00:32:46.250 Removing: /var/run/dpdk/spdk_pid532459 00:32:46.250 Removing: /var/run/dpdk/spdk_pid532811 00:32:46.250 Removing: /var/run/dpdk/spdk_pid534576 00:32:46.250 Removing: /var/run/dpdk/spdk_pid536561 00:32:46.250 Removing: /var/run/dpdk/spdk_pid538232 00:32:46.250 Removing: /var/run/dpdk/spdk_pid539583 00:32:46.250 Removing: /var/run/dpdk/spdk_pid541185 00:32:46.250 Removing: /var/run/dpdk/spdk_pid542781 00:32:46.250 Removing: /var/run/dpdk/spdk_pid544385 00:32:46.250 Removing: /var/run/dpdk/spdk_pid545600 00:32:46.250 Removing: /var/run/dpdk/spdk_pid546228 00:32:46.509 Removing: /var/run/dpdk/spdk_pid546615 00:32:46.509 Removing: /var/run/dpdk/spdk_pid548771 00:32:46.509 Removing: /var/run/dpdk/spdk_pid550628 00:32:46.509 Removing: /var/run/dpdk/spdk_pid552480 00:32:46.509 Removing: /var/run/dpdk/spdk_pid553546 00:32:46.509 Removing: /var/run/dpdk/spdk_pid554783 00:32:46.509 Removing: /var/run/dpdk/spdk_pid555328 00:32:46.509 Removing: /var/run/dpdk/spdk_pid555506 00:32:46.509 Removing: /var/run/dpdk/spdk_pid555580 00:32:46.509 Removing: /var/run/dpdk/spdk_pid555889 00:32:46.509 Removing: /var/run/dpdk/spdk_pid555974 00:32:46.509 Removing: /var/run/dpdk/spdk_pid557119 00:32:46.509 Removing: /var/run/dpdk/spdk_pid558715 00:32:46.509 Removing: /var/run/dpdk/spdk_pid560722 00:32:46.509 Removing: /var/run/dpdk/spdk_pid561444 00:32:46.509 Removing: /var/run/dpdk/spdk_pid562167 00:32:46.509 Removing: /var/run/dpdk/spdk_pid562525 00:32:46.509 Removing: /var/run/dpdk/spdk_pid562549 00:32:46.509 Removing: /var/run/dpdk/spdk_pid562588 00:32:46.509 Removing: /var/run/dpdk/spdk_pid563517 00:32:46.509 Removing: /var/run/dpdk/spdk_pid564132 00:32:46.509 Removing: /var/run/dpdk/spdk_pid564600 00:32:46.509 Removing: /var/run/dpdk/spdk_pid566782 00:32:46.509 Removing: /var/run/dpdk/spdk_pid568644 00:32:46.509 Removing: /var/run/dpdk/spdk_pid570497 00:32:46.509 Removing: /var/run/dpdk/spdk_pid571565 00:32:46.509 Removing: /var/run/dpdk/spdk_pid572801 00:32:46.509 Removing: /var/run/dpdk/spdk_pid573348 00:32:46.509 Removing: /var/run/dpdk/spdk_pid573470 00:32:46.509 Removing: /var/run/dpdk/spdk_pid577449 00:32:46.509 Removing: /var/run/dpdk/spdk_pid577659 00:32:46.509 Removing: /var/run/dpdk/spdk_pid577785 00:32:46.509 Removing: /var/run/dpdk/spdk_pid577894 00:32:46.509 Removing: /var/run/dpdk/spdk_pid578102 00:32:46.509 Removing: /var/run/dpdk/spdk_pid578316 00:32:46.509 Removing: /var/run/dpdk/spdk_pid579195 00:32:46.509 Removing: /var/run/dpdk/spdk_pid580535 00:32:46.509 Removing: /var/run/dpdk/spdk_pid581599 00:32:46.509 Clean 00:32:46.767 00:14:47 -- common/autotest_common.sh@1447 -- # return 0 00:32:46.767 00:14:47 -- spdk/autotest.sh@380 -- # timing_exit post_cleanup 00:32:46.767 00:14:47 -- common/autotest_common.sh@726 -- # xtrace_disable 00:32:46.767 00:14:47 -- common/autotest_common.sh@10 -- # set +x 00:32:46.767 00:14:47 -- spdk/autotest.sh@382 -- # timing_exit autotest 00:32:46.767 00:14:47 -- common/autotest_common.sh@726 -- # xtrace_disable 00:32:46.767 00:14:47 -- common/autotest_common.sh@10 -- # set +x 00:32:46.767 00:14:47 -- spdk/autotest.sh@383 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:32:46.767 00:14:47 -- spdk/autotest.sh@385 -- # [[ -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:32:46.767 00:14:47 -- spdk/autotest.sh@385 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:32:46.767 00:14:47 -- spdk/autotest.sh@387 -- # hash lcov 00:32:46.767 00:14:47 -- spdk/autotest.sh@387 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:32:46.767 00:14:47 -- spdk/autotest.sh@389 -- # hostname 00:32:46.767 00:14:47 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:32:47.027 geninfo: WARNING: invalid characters removed from testname! 00:33:13.578 00:15:13 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:16.865 00:15:17 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:19.459 00:15:19 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:21.995 00:15:22 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:24.529 00:15:24 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:27.061 00:15:27 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:29.591 00:15:30 -- spdk/autotest.sh@396 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:33:29.849 00:15:30 -- common/autobuild_common.sh@15 -- $ 
source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:29.849 00:15:30 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:33:29.849 00:15:30 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:29.849 00:15:30 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:29.849 00:15:30 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:29.849 00:15:30 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:29.849 00:15:30 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:29.849 00:15:30 -- paths/export.sh@5 -- $ export PATH 00:33:29.849 00:15:30 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:29.850 00:15:30 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:29.850 00:15:30 -- common/autobuild_common.sh@437 -- $ date +%s 00:33:29.850 00:15:30 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715724930.XXXXXX 00:33:29.850 00:15:30 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715724930.omVOtB 00:33:29.850 00:15:30 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:33:29.850 00:15:30 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:33:29.850 00:15:30 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:33:29.850 00:15:30 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:33:29.850 00:15:30 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:33:29.850 00:15:30 -- common/autobuild_common.sh@453 -- $ get_config_params 00:33:29.850 00:15:30 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:33:29.850 00:15:30 -- common/autotest_common.sh@10 -- $ set +x 00:33:29.850 00:15:30 -- common/autobuild_common.sh@453 -- $ 
config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:33:29.850 00:15:30 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:33:29.850 00:15:30 -- pm/common@17 -- $ local monitor 00:33:29.850 00:15:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:29.850 00:15:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:29.850 00:15:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:29.850 00:15:30 -- pm/common@21 -- $ date +%s 00:33:29.850 00:15:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:29.850 00:15:30 -- pm/common@21 -- $ date +%s 00:33:29.850 00:15:30 -- pm/common@25 -- $ sleep 1 00:33:29.850 00:15:30 -- pm/common@21 -- $ date +%s 00:33:29.850 00:15:30 -- pm/common@21 -- $ date +%s 00:33:29.850 00:15:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715724930 00:33:29.850 00:15:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715724930 00:33:29.850 00:15:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715724930 00:33:29.850 00:15:30 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715724930 00:33:29.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715724930_collect-vmstat.pm.log 00:33:29.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715724930_collect-cpu-load.pm.log 00:33:29.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715724930_collect-cpu-temp.pm.log 00:33:29.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715724930_collect-bmc-pm.bmc.pm.log 00:33:30.785 00:15:31 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:33:30.785 00:15:31 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:33:30.785 00:15:31 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:30.785 00:15:31 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:33:30.785 00:15:31 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:33:30.785 00:15:31 -- spdk/autopackage.sh@19 -- $ timing_finish 00:33:30.785 00:15:31 -- common/autotest_common.sh@732 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:33:30.785 00:15:31 -- common/autotest_common.sh@733 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:33:30.785 00:15:31 -- common/autotest_common.sh@735 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:33:30.785 00:15:31 -- spdk/autopackage.sh@20 -- $ exit 0 00:33:30.785 00:15:31 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 
00:33:30.785 00:15:31 -- pm/common@29 -- $ signal_monitor_resources TERM 00:33:30.785 00:15:31 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:33:30.785 00:15:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:30.785 00:15:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:33:30.785 00:15:31 -- pm/common@44 -- $ pid=592448 00:33:30.785 00:15:31 -- pm/common@50 -- $ kill -TERM 592448 00:33:30.785 00:15:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:30.785 00:15:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:33:30.785 00:15:31 -- pm/common@44 -- $ pid=592450 00:33:30.785 00:15:31 -- pm/common@50 -- $ kill -TERM 592450 00:33:30.785 00:15:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:30.785 00:15:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:33:30.785 00:15:31 -- pm/common@44 -- $ pid=592452 00:33:30.785 00:15:31 -- pm/common@50 -- $ kill -TERM 592452 00:33:30.785 00:15:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:30.785 00:15:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:33:30.785 00:15:31 -- pm/common@44 -- $ pid=592481 00:33:30.785 00:15:31 -- pm/common@50 -- $ sudo -E kill -TERM 592481 00:33:30.785 + [[ -n 230144 ]] 00:33:30.785 + sudo kill 230144 00:33:31.053 [Pipeline] } 00:33:31.069 [Pipeline] // stage 00:33:31.075 [Pipeline] } 00:33:31.092 [Pipeline] // timeout 00:33:31.097 [Pipeline] } 00:33:31.113 [Pipeline] // catchError 00:33:31.118 [Pipeline] } 00:33:31.134 [Pipeline] // wrap 00:33:31.138 [Pipeline] } 00:33:31.154 [Pipeline] // catchError 00:33:31.162 [Pipeline] stage 00:33:31.164 [Pipeline] { (Epilogue) 00:33:31.178 [Pipeline] catchError 00:33:31.179 [Pipeline] { 00:33:31.192 [Pipeline] echo 00:33:31.194 Cleanup processes 00:33:31.198 [Pipeline] sh 00:33:31.478 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:31.478 592567 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:33:31.478 592771 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:31.490 [Pipeline] sh 00:33:31.774 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:31.774 ++ grep -v 'sudo pgrep' 00:33:31.774 ++ awk '{print $1}' 00:33:31.774 + sudo kill -9 592567 00:33:31.785 [Pipeline] sh 00:33:32.063 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:33:44.318 [Pipeline] sh 00:33:44.605 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:33:44.605 Artifacts sizes are good 00:33:44.619 [Pipeline] archiveArtifacts 00:33:44.627 Archiving artifacts 00:33:44.783 [Pipeline] sh 00:33:45.071 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:33:45.088 [Pipeline] cleanWs 00:33:45.099 [WS-CLEANUP] Deleting project workspace... 00:33:45.099 [WS-CLEANUP] Deferred wipeout is used... 00:33:45.106 [WS-CLEANUP] done 00:33:45.108 [Pipeline] } 00:33:45.130 [Pipeline] // catchError 00:33:45.146 [Pipeline] sh 00:33:45.429 + logger -p user.info -t JENKINS-CI 00:33:45.439 [Pipeline] } 00:33:45.456 [Pipeline] // stage 00:33:45.463 [Pipeline] } 00:33:45.482 [Pipeline] // node 00:33:45.488 [Pipeline] End of Pipeline 00:33:45.521 Finished: SUCCESS
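For reference, the coverage post-processing that runs near the end of the log above (the lcov capture tagged with the spdk-wfp-50 hostname, followed by the series of filter passes) reduces to the pipeline sketched below. The lcov invocations and filter patterns are the ones printed in the trace; $OUT and $LCOV_OPTS are shorthand introduced here for readability, and cov_base.info is assumed to be the pre-test baseline captured earlier in the run (not shown in this section).

#!/usr/bin/env bash
# Condensed sketch of the lcov steps shown in the trace.
set -e
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
OUT=$SPDK_DIR/../output
LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 \
  --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 \
  --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q"

# Capture post-test counters, tagged with the hostname as the test name.
lcov $LCOV_OPTS -c -d "$SPDK_DIR" -t "$(hostname)" -o "$OUT/cov_test.info"

# Merge the baseline with the post-test capture.
lcov $LCOV_OPTS -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"

# Strip paths that should not count toward SPDK coverage.
for pattern in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
    lcov $LCOV_OPTS -r "$OUT/cov_total.info" "$pattern" -o "$OUT/cov_total.info"
done

# Drop the intermediate files, as the trace does.
rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR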